The muHVT package is a collection of R functions for building topology-preserving maps for rich multivariate data analysis, particularly for datasets tending towards big-data scale with a large number of rows. The functions for this typical workflow are organized below:
Data Compression: Vector quantization (VQ) and HVQ (hierarchical vector quantization) using means or medians. This step compresses the rows of a long data frame according to a compression objective.
Data Projection: Projection of the compressed cells to 1D, 2D or 3D with Sammon's non-linear mapping algorithm. This step creates the topology-preserving map (also called an embedding) coordinates in the desired output dimension.
Tessellation: Creation of the cells required for visualization using the Voronoi tessellation method; the package includes heatmap plots for hierarchical Voronoi tessellations (HVT). This step enables data insights, visualization, and interaction with the topology-preserving map, and is useful for semi-supervised tasks.
Prediction: Scoring new datasets and recording their cell assignments using the map objects from the steps above, chained through a sequence of maps if required.
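The first two steps above can be sketched end-to-end with base-R stand-ins — `stats::kmeans` for the compression step and `MASS::sammon` for the projection step. This is an illustrative analogue, not the muHVT implementation; the iris data and the cell count of 15 are arbitrary choices.

```r
# Minimal stand-in for the compression + projection steps of the workflow.
set.seed(240)
X <- scale(as.matrix(iris[, 1:4]))   # numeric features, Z-scored

# 1. Data Compression: 15 cells via k-means (HVT wraps this kind of step)
km <- kmeans(X, centers = 15, nstart = 5)

# 2. Data Projection: Sammon's mapping of the 15 centroids to 2D
proj <- MASS::sammon(dist(km$centers), k = 2, trace = FALSE)

head(proj$points)   # 2D coordinates of the compressed cells
```

Each row of `proj$points` is the embedding coordinate of one compressed cell; the tessellation and prediction steps would operate on these coordinates.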
Data Understanding
In this vignette, we will use the
Prices of Personal Computers dataset. This
dataset contains 6259 observations and 6 features. The dataset observes
the price from 1993 to 1995 of 486 personal computers in the US. The
variables are price, speed, hd, ram, screen and ads.
Here, we load the data and store it in a variable called
computers.
set.seed(240)
# Load data from csv files
computers <- read.csv("https://raw.githubusercontent.com/Mu-Sigma/muHVT/master/vignettes/sample_dataset/Computers.csv")

Raw Personal Computers Dataset
The Computers dataset includes the following columns:
Let’s explore the Personal Computers dataset containing 6,259 data points. For the sake of brevity, we display only the first six rows.
Table(head(computers), scroll = T, limit = 20)

| X | price | speed | hd | ram | screen | cd | multi | premium | ads | trend |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 1499 | 25 | 80 | 4 | 14 | no | no | yes | 94 | 1 |
| 2 | 1795 | 33 | 85 | 2 | 14 | no | no | yes | 94 | 1 |
| 3 | 1595 | 25 | 170 | 4 | 15 | no | no | yes | 94 | 1 |
| 4 | 1849 | 25 | 170 | 8 | 14 | no | no | no | 94 | 1 |
| 5 | 3295 | 33 | 340 | 16 | 14 | no | no | yes | 94 | 1 |
| 6 | 3695 | 66 | 340 | 16 | 14 | no | no | yes | 94 | 1 |
Now, let us check the structure of the data and analyse its summary.
str(computers)
#> 'data.frame': 6259 obs. of 11 variables:
#> $ X : int 1 2 3 4 5 6 7 8 9 10 ...
#> $ price : int 1499 1795 1595 1849 3295 3695 1720 1995 2225 2575 ...
#> $ speed : int 25 33 25 25 33 66 25 50 50 50 ...
#> $ hd : int 80 85 170 170 340 340 170 85 210 210 ...
#> $ ram : int 4 2 4 8 16 16 4 2 8 4 ...
#> $ screen : int 14 14 15 14 14 14 14 14 14 15 ...
#> $ cd : chr "no" "no" "no" "no" ...
#> $ multi : chr "no" "no" "no" "no" ...
#> $ premium: chr "yes" "yes" "yes" "no" ...
#> $ ads : int 94 94 94 94 94 94 94 94 94 94 ...
#> $ trend : int 1 1 1 1 1 1 1 1 1 1 ...

summary(computers)
#> X price speed hd
#> Min. : 1 Min. : 949 Min. : 25.00 Min. : 80.0
#> 1st Qu.:1566 1st Qu.:1794 1st Qu.: 33.00 1st Qu.: 214.0
#> Median :3130 Median :2144 Median : 50.00 Median : 340.0
#> Mean :3130 Mean :2220 Mean : 52.01 Mean : 416.6
#> 3rd Qu.:4694 3rd Qu.:2595 3rd Qu.: 66.00 3rd Qu.: 528.0
#> Max. :6259 Max. :5399 Max. :100.00 Max. :2100.0
#> ram screen cd multi
#> Min. : 2.000 Min. :14.00 Length:6259 Length:6259
#> 1st Qu.: 4.000 1st Qu.:14.00 Class :character Class :character
#> Median : 8.000 Median :14.00 Mode :character Mode :character
#> Mean : 8.287 Mean :14.61
#> 3rd Qu.: 8.000 3rd Qu.:15.00
#> Max. :32.000 Max. :17.00
#> premium ads trend
#> Length:6259 Min. : 39.0 Min. : 1.00
#> Class :character 1st Qu.:162.5 1st Qu.:10.00
#> Mode :character Median :246.0 Median :16.00
#> Mean :221.3 Mean :15.93
#> 3rd Qu.:275.0 3rd Qu.:21.50
#> Max. :339.0 Max. :35.00

Let us first split the data into train and test. We will randomly select 80% of the data for training and keep the remaining 20% for testing.
num_rows <- nrow(computers)
set.seed(123)
train_indices <- sample(1:num_rows, 0.8 * num_rows)
trainComputers <- computers[train_indices, ]
testComputers <- computers[-train_indices, ]

K-means is not suitable for factor variables, since the sample space for factor variables is discrete and a Euclidean distance function on such a space is not meaningful. Hence, we will drop the factor variables (cd, multi, premium) along with the index and trend columns (X, trend) from our dataset.
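A quick base-R illustration of the point: factor levels are stored as arbitrary integer codes, so any Euclidean distance computed over them measures the coding order, not a real magnitude.

```r
# Factor levels get integer codes in alphabetical level order, so a
# "distance" between them reflects the coding, not any real quantity.
cd <- factor(c("no", "yes", "no"))
as.integer(cd)          # 1 2 1 -- arbitrary codes

# |"yes" - "no"| becomes |2 - 1| = 1, a number with no meaning
dist(as.integer(cd))
```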
trainComputers <-
trainComputers %>% dplyr::select(-c(X, cd, multi, premium, trend))
testComputers <-
testComputers %>% dplyr::select(-c(X, cd, multi, premium, trend))

Raw Training Dataset
Now, let’s have a look at the randomly selected raw training dataset containing 5,007 data points. For the sake of brevity, we display only the first six rows.
trainComputers_data <- trainComputers %>% as.data.frame() %>% round(4)
trainComputers_data$Row.No <- as.numeric(row.names(trainComputers_data))
trainComputers_data <- trainComputers_data %>% dplyr::select(Row.No,price,speed,hd,ram,screen,ads)
row.names(trainComputers_data) <- NULL
Table(head(trainComputers_data))

| Row.No | price | speed | hd | ram | screen | ads |
|---|---|---|---|---|---|---|
| 2463 | 2799 | 50 | 230 | 8 | 15 | 216 |
| 2511 | 2197 | 33 | 270 | 4 | 14 | 216 |
| 2227 | 2744 | 50 | 340 | 8 | 17 | 275 |
| 526 | 2999 | 66 | 245 | 16 | 15 | 139 |
| 4291 | 1974 | 33 | 200 | 4 | 14 | 248 |
| 2986 | 2490 | 33 | 528 | 16 | 14 | 267 |
Raw Testing Dataset
Now, let’s have a look at the randomly selected raw testing dataset containing 1,252 data points. For the sake of brevity, we display only the first six rows.
testComputers_data <- testComputers %>% as.data.frame() %>% round(4)
testComputers_data$Row.No <- as.numeric(row.names(testComputers_data))
testComputers_data <- testComputers_data %>% dplyr::select(Row.No,price,speed,hd,ram,screen,ads)
rownames(testComputers_data) <- NULL
Table(head(testComputers_data))

| Row.No | price | speed | hd | ram | screen | ads |
|---|---|---|---|---|---|---|
| 3 | 1595 | 25 | 170 | 4 | 15 | 94 |
| 4 | 1849 | 25 | 170 | 8 | 14 | 94 |
| 7 | 1720 | 25 | 170 | 4 | 14 | 94 |
| 10 | 2575 | 50 | 210 | 4 | 15 | 94 |
| 11 | 2195 | 33 | 170 | 8 | 15 | 94 |
| 14 | 2295 | 25 | 245 | 8 | 14 | 94 |
Let us try to visualize the compressed Map A from the flow diagram below.
Figure 1: Flow map with highlighted bounding box in red around compressed map A
This package can perform vector quantization using the following algorithms: k-means and k-medoids.
For more information on vector quantization, refer to the following link.
The HVT function constructs highly compressed hierarchical Voronoi tessellations. The raw data is first scaled, and this scaled data is supplied as input to the vector quantization algorithm. The algorithm compresses the dataset until a user-defined compression percentage is achieved, with the quantization error acting as the threshold that determines the compression: for a given compression percentage, each of the resulting n cells must have a quantization error below the threshold quantization error.
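To make the quantization error concrete, here is a small base-R sketch — an assumed form mirroring distance_metric = "L1_Norm" and error_metric = "max", not the package's internal code — that computes the error for one cell with made-up points:

```r
# Quantization error of a single cell (assumed form): max L1 distance
# of the cell's member points to the cell centroid.
cell_points <- matrix(c(1.0, 2.0,
                        1.2, 1.8,
                        0.9, 2.3), ncol = 2, byrow = TRUE)
centroid <- colMeans(cell_points)

# L1 (Manhattan) distance of every point to the centroid
l1_dist <- rowSums(abs(sweep(cell_points, 2, centroid)))

quant_err <- max(l1_dist)   # "max" error metric for this cell
quant_err
```

A cell is counted towards the compression percentage when `quant_err` falls below the user-defined `quant.err` threshold.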
Let’s try to comprehend the HVT function first before moving ahead.
HVT(
dataset,
min_compression_perc,
n_cells,
depth,
quant.err,
distance_metric = c("L1_Norm", "L2_Norm"),
error_metric = c("mean", "max"),
quant_method = c("kmeans", "kmedoids"),
normalize = TRUE,
diagnose = FALSE,
hvt_validation = FALSE,
train_validation_split_ratio = 0.8
)

Each of the parameters is explained below:
dataset - A dataframe with numeric
columns.
min_compression_perc - An integer
indicating the minimum percent compression rate to be achieved for the
dataset.
n_cells - An integer indicating the
number of cells per hierarchy (level).
depth - An integer indicating the
number of levels. (1 = No hierarchy, 2 = 2 levels, etc …).
quant.error - A number indicating
the quantization error threshold. A cell will only break down into
further cells if the quantization error of the cell is above the defined
quantization error threshold.
projection.scale - A number
indicating the scale factor for the tessellations, so as to visualize
the sub-tessellations well enough.
scale_summary - A list with mean
and standard deviation values for all the features in the dataset. Pass
the scale summary when the input dataset is already scaled or normalize
is set to FALSE.
distance_metric - The distance
metric can be L1_Norm or L2_Norm.
L1_Norm is selected by default. The distance metric is used
to calculate the distance between an n dimensional point
and centroid. The user can also pass a custom function to calculate this
distance.
error_metric - The error metric can
be mean or max. max is selected
by default. max will return the max of m
values and mean will take mean of m values
where each value is a distance between a point and centroid of the cell.
Moreover, the user can also pass a custom function to calculate the
error metric.
quant_method - The quantization
method can be kmeans or kmedoids.
kmeans is selected by default.
normalize - A logical value
indicating whether the columns in your dataset need to be normalized.
Default value is TRUE. The algorithm supports Z-score
normalization.
diagnose - A logical value
indicating whether the user wants to perform diagnostics on the model.
Default value is FALSE.
hvt_validation - A logical value
indicating whether the user wants to hold out a validation set and find
the mean absolute deviation of the validation points from the centroid.
Default value is FALSE.
train_validation_split_ratio - A
numeric value indicating train validation split ratio. This argument is
only used when hvt_validation has been set to TRUE. Default value for
the argument is 0.8.
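The Z-score normalization that normalize = TRUE applies (and the scale_summary list that captures it) can be sketched with base R's scale; the toy data frame below is illustrative, not part of the vignette's dataset.

```r
# Z-score normalization: subtract the column mean, divide by the column
# standard deviation -- what normalize = TRUE does to each feature.
df <- data.frame(price = c(1499, 1795, 1595, 1849),
                 speed = c(25, 33, 25, 25))
scaled <- scale(df)

# scale_summary-style list: means and sds, reusable on new data
scale_summary <- list(mean = attr(scaled, "scaled:center"),
                      sd   = attr(scaled, "scaled:scale"))

round(colMeans(scaled), 10)   # each scaled column has mean 0
apply(scaled, 2, sd)          # and standard deviation 1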
We will use the HVT function to compress our data while
preserving the essential features of the dataset. Our goal is to achieve
data compression of at least 80%. In situations where the
compression ratio does not meet the desired target, we can explore
adjusting the model parameters as a potential solution. This involves
modifying parameters such as the quantization error threshold or
increasing the number of cells and then rerunning the HVT
function.
In our example we will iteratively increase the number of cells until the desired compression percentage is reached, rather than raising the quantization error threshold, because a higher threshold may reduce the level of detail captured in the data representation.
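This iterative strategy can be sketched with stats::kmeans as a stand-in for HVT; the dataset, the 0.2 threshold, the 80% target, and the step size of 10 cells are all illustrative assumptions.

```r
# Grow the number of cells until enough cells fall below the
# quantization error threshold (k-means stand-in, not muHVT itself).
set.seed(123)
X <- scale(as.matrix(iris[, 1:4]))   # stand-in numeric dataset, Z-scored
quant_threshold <- 0.2               # assumed quant.err threshold
target <- 0.8                        # desired share of compliant cells

pct_below <- 0
n_cells <- 10
while (pct_below < target && n_cells < nrow(X)) {
  km <- kmeans(X, centers = n_cells, nstart = 5)
  # max L1 distance of each cell's points to its centroid
  cell_err <- sapply(seq_len(n_cells), function(i) {
    pts <- X[km$cluster == i, , drop = FALSE]
    max(rowSums(abs(sweep(pts, 2, km$centers[i, ]))))
  })
  pct_below <- mean(cell_err < quant_threshold)
  n_cells <- n_cells + 10            # grow the codebook and retry
}
pct_below   # share of cells below the threshold at termination
```

In the vignette's flow, each retry corresponds to calling HVT again with a larger n_cells and re-reading the compression summary.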
First, we will construct map A by using
the below mentioned model parameters.
We will pass the below mentioned model parameters along with training
dataset to HVT function.
Model Parameters
set.seed(240)
map_A <- list()
map_A <- muHVT::HVT(
trainComputers,
n_cells = 200,
depth = 1,
quant.err = 0.2,
projection.scale = 10,
normalize = T,
distance_metric = "L1_Norm",
error_metric = "max",
quant_method = "kmeans"
)

Let’s check out the compression summary.
compressionSummaryTable(map_A[[3]]$compression_summary)

| segmentLevel | noOfCells | noOfCellsBelowQuantizationError | percentOfCellsBelowQuantizationErrorThreshold | parameters |
|---|---|---|---|---|
| 1 | 200 | 83 | 0.42 | n_cells: 200 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans |
As can be seen from the table above, only 42% of the cells have a quantization error below the threshold. Therefore we can further subdivide the cells by increasing the n_cells parameter and then check whether the desired compression (80%) is reached.
Since we are yet to achieve at least 80% compression, let’s try to compress again using the set of model parameters mentioned below.
Model Parameters
map_A <- list()
map_A <-muHVT::HVT(trainComputers,
n_cells = 440,
quant.err = 0.2,
depth = 1,
distance_metric = "L1_Norm",
error_metric = "max",
quant_method = "kmeans",
normalize = T)

As per the manual, map_A[[3]] gives us detailed information about the hierarchical vector quantized data.
map_A[[3]][['summary']] gives a tabular summary containing the number of points, the quantization error, and the codebook (centroid coordinates) for each cell.
The datatable displayed below is the summary from map A.
summaryTable(map_A[[3]]$summary,scroll = T,limit = 500)

| Segment.Level | Segment.Parent | Segment.Child | n | Cell.ID | Quant.Error | price | speed | hd | ram | screen | ads |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 7 | 46 | 0.08 | -0.76 | -0.89 | -0.88 | -0.76 | -0.67 | 1.57 |
| 1 | 1 | 2 | 10 | 108 | 0.08 | -0.80 | -0.89 | -0.16 | -0.76 | -0.67 | 0.67 |
| 1 | 1 | 3 | 15 | 223 | 0.12 | 0.37 | -0.89 | -0.72 | -0.05 | 0.43 | -1.65 |
| 1 | 1 | 4 | 11 | 54 | 0.07 | -1.50 | -0.89 | -0.75 | -0.76 | -0.67 | 0.62 |
| 1 | 1 | 5 | 8 | 146 | 0.13 | -0.31 | 0.68 | -0.95 | -0.89 | -0.67 | -0.14 |
| 1 | 1 | 6 | 11 | 150 | 0.16 | -0.66 | 0.68 | -0.78 | -0.79 | -0.67 | -0.73 |
| 1 | 1 | 7 | 11 | 170 | 0.1 | 0.03 | -1.24 | -0.13 | -0.05 | -0.67 | 0.38 |
| 1 | 1 | 8 | 8 | 334 | 0.15 | 0.62 | 2.30 | 0.08 | -0.05 | 0.43 | 0.04 |
| 1 | 1 | 9 | 8 | 114 | 0.07 | -0.16 | 0.68 | -1.19 | -1.11 | -0.67 | 0.87 |
| 1 | 1 | 10 | 7 | 248 | 0.17 | 0.51 | -0.08 | 0.34 | -0.05 | -0.67 | -0.30 |
| 1 | 1 | 11 | 9 | 140 | 0.12 | -0.01 | 0.68 | -1.15 | -1.00 | -0.67 | 0.33 |
| 1 | 1 | 12 | 7 | 219 | 0.14 | -1.36 | 0.24 | 0.46 | -0.05 | -0.67 | -0.74 |
| 1 | 1 | 13 | 9 | 271 | 0.05 | -1.08 | 0.68 | 0.49 | -0.05 | 0.43 | -0.84 |
| 1 | 1 | 14 | 19 | 109 | 0.06 | -0.31 | -0.89 | -0.74 | -0.76 | -0.67 | 0.38 |
| 1 | 1 | 15 | 6 | 176 | 0.08 | -0.72 | -0.89 | -0.07 | -0.05 | 0.43 | 0.72 |
| 1 | 1 | 16 | 17 | 332 | 0.14 | 0.42 | 2.30 | 0.10 | -0.05 | 0.43 | 1.50 |
| 1 | 1 | 17 | 12 | 18 | 0.05 | -1.21 | -1.27 | -1.19 | -1.11 | -0.67 | 0.97 |
| 1 | 1 | 18 | 19 | 149 | 0.16 | -0.68 | -0.08 | -0.46 | -0.76 | 0.43 | 0.79 |
| 1 | 1 | 19 | 17 | 428 | 0.35 | 0.18 | 2.30 | 2.53 | 1.37 | 0.43 | -2.22 |
| 1 | 1 | 20 | 20 | 320 | 0.36 | 0.82 | -0.16 | -0.09 | -0.12 | 2.64 | 0.71 |
| 1 | 1 | 21 | 3 | 305 | 0.18 | 2.27 | -0.35 | -0.01 | -0.05 | -0.67 | -1.32 |
| 1 | 1 | 22 | 7 | 227 | 0.1 | -0.51 | -0.89 | 0.45 | -0.05 | 0.43 | -0.47 |
| 1 | 1 | 23 | 10 | 178 | 0.12 | 0.00 | 0.68 | -0.86 | -0.76 | -0.67 | -0.90 |
| 1 | 1 | 24 | 9 | 365 | 0.1 | 0.68 | -0.08 | 1.20 | 1.37 | 0.43 | -0.36 |
| 1 | 1 | 25 | 5 | 14 | 0.11 | -1.99 | -0.89 | -0.96 | -1.11 | -0.67 | 0.18 |
| 1 | 1 | 26 | 3 | 411 | 0.05 | 1.25 | -0.89 | 2.29 | 2.79 | 0.43 | 0.57 |
| 1 | 1 | 27 | 18 | 122 | 0.15 | -0.18 | -0.98 | -0.85 | -0.76 | 0.43 | 0.68 |
| 1 | 1 | 28 | 15 | 189 | 0.11 | 0.40 | -0.92 | 0.03 | -0.05 | -0.67 | 0.87 |
| 1 | 1 | 29 | 11 | 107 | 0.11 | -0.49 | -0.96 | -0.88 | -0.76 | -0.67 | -0.64 |
| 1 | 1 | 30 | 7 | 423 | 0.47 | 3.55 | 0.12 | 2.51 | 1.37 | -0.67 | 0.44 |
| 1 | 1 | 31 | 14 | 90 | 0.05 | -0.63 | -0.89 | -0.79 | -0.76 | -0.67 | 0.58 |
| 1 | 1 | 32 | 22 | 430 | 0.24 | 0.63 | 0.75 | 3.07 | 2.79 | 0.43 | -2.27 |
| 1 | 1 | 33 | 5 | 390 | 0.3 | 1.37 | -0.89 | 3.73 | -0.19 | -0.45 | 0.70 |
| 1 | 1 | 34 | 25 | 101 | 0.18 | -0.85 | -0.97 | -0.71 | -0.76 | 0.43 | 0.84 |
| 1 | 1 | 35 | 11 | 425 | 0.07 | 0.15 | 2.30 | 1.70 | 1.37 | 0.43 | -2.39 |
| 1 | 1 | 36 | 10 | 358 | 0.05 | 0.24 | -0.89 | 1.20 | 1.37 | 0.43 | -0.84 |
| 1 | 1 | 37 | 16 | 166 | 0.11 | 0.03 | 0.68 | -1.08 | -0.78 | -0.67 | -1.65 |
| 1 | 1 | 38 | 13 | 45 | 0.05 | -0.91 | -0.89 | -1.19 | -1.11 | -0.67 | 0.42 |
| 1 | 1 | 39 | 8 | 383 | 0.12 | 1.15 | 2.30 | 0.45 | 1.37 | -0.67 | -0.16 |
| 1 | 1 | 40 | 5 | 9 | 0.07 | -1.24 | -0.97 | -1.19 | -1.11 | -0.67 | 1.57 |
| 1 | 1 | 41 | 11 | 419 | 0.06 | 1.41 | -0.89 | 2.29 | 2.79 | 0.43 | -0.82 |
| 1 | 1 | 42 | 8 | 242 | 0.14 | -0.81 | -0.08 | 0.30 | -0.05 | 0.43 | -0.65 |
| 1 | 1 | 43 | 13 | 179 | 0.09 | 0.41 | 0.68 | -0.76 | -0.76 | -0.67 | 0.40 |
| 1 | 1 | 44 | 5 | 375 | 0.04 | 0.06 | -0.08 | 1.70 | 1.37 | 0.43 | -0.79 |
| 1 | 1 | 45 | 20 | 129 | 0.14 | -1.15 | 0.68 | -0.79 | -0.76 | -0.67 | -0.40 |
| 1 | 1 | 46 | 10 | 292 | 0.22 | 0.86 | 0.68 | -0.65 | -0.12 | 0.43 | -1.41 |
| 1 | 1 | 47 | 5 | 79 | 0.12 | -0.89 | -1.04 | -0.94 | -0.05 | -0.67 | 1.02 |
| 1 | 1 | 48 | 23 | 246 | 0.11 | -0.42 | 0.68 | 0.46 | -0.05 | -0.67 | -0.63 |
| 1 | 1 | 49 | 8 | 207 | 0.25 | 0.74 | -0.89 | -0.40 | -0.40 | 0.43 | 0.52 |
| 1 | 1 | 50 | 11 | 27 | 0.06 | -1.06 | -1.27 | -1.19 | -1.11 | -0.67 | 0.43 |
| 1 | 1 | 51 | 8 | 51 | 0.11 | -1.23 | -0.08 | -1.05 | -0.85 | -0.67 | 0.94 |
| 1 | 1 | 52 | 19 | 288 | 0.09 | 0.81 | -0.89 | 0.45 | 1.37 | -0.67 | 0.88 |
| 1 | 1 | 53 | 7 | 154 | 0.1 | -0.62 | 0.68 | -0.15 | -0.76 | -0.67 | 0.84 |
| 1 | 1 | 54 | 10 | 261 | 0.15 | 0.61 | -0.08 | -0.67 | -0.05 | 0.43 | -1.40 |
| 1 | 1 | 55 | 10 | 195 | 0.15 | 0.18 | 0.77 | -0.09 | -0.76 | -0.67 | 0.83 |
| 1 | 1 | 56 | 14 | 250 | 0.09 | 0.52 | 0.68 | -0.69 | -0.05 | -0.67 | -1.61 |
| 1 | 1 | 57 | 20 | 331 | 0.2 | -0.63 | 0.68 | 0.30 | -0.76 | 2.64 | -0.95 |
| 1 | 1 | 58 | 9 | 379 | 0.15 | 1.33 | -0.08 | -0.65 | -0.29 | 2.64 | -1.52 |
| 1 | 1 | 59 | 14 | 11 | 0.21 | -0.28 | -1.05 | -0.79 | -0.76 | 2.64 | 0.53 |
| 1 | 1 | 60 | 29 | 359 | 0.13 | 1.36 | 0.68 | 0.18 | 1.37 | 0.43 | 0.77 |
| 1 | 1 | 61 | 6 | 337 | 0.1 | 2.46 | 0.68 | 0.21 | -0.05 | -0.67 | -0.87 |
| 1 | 1 | 62 | 6 | 1 | 0.17 | -0.17 | -1.21 | -1.02 | -0.76 | 2.64 | 1.32 |
| 1 | 1 | 63 | 28 | 243 | 0.33 | -0.33 | 2.30 | -0.23 | -0.46 | -0.67 | -0.88 |
| 1 | 1 | 64 | 8 | 274 | 0.07 | 0.41 | -1.27 | 0.45 | 1.37 | -0.67 | 0.68 |
| 1 | 1 | 65 | 13 | 362 | 0.14 | 1.07 | 0.75 | 0.35 | 1.37 | 0.43 | 1.34 |
| 1 | 1 | 66 | 10 | 143 | 0.07 | -0.34 | -0.89 | -0.80 | -0.05 | -0.67 | -1.66 |
| 1 | 1 | 67 | 4 | 265 | 0.05 | -0.55 | 0.68 | 1.23 | -0.05 | -0.67 | -0.69 |
| 1 | 1 | 68 | 11 | 13 | 0.15 | -0.83 | -0.89 | -0.25 | -0.76 | 2.64 | -0.33 |
| 1 | 1 | 69 | 8 | 298 | 0.17 | -0.62 | 0.20 | 2.29 | -0.05 | -0.67 | -0.95 |
| 1 | 1 | 70 | 4 | 335 | 0.06 | 1.34 | -0.08 | 0.45 | 1.37 | -0.67 | -0.08 |
| 1 | 1 | 71 | 20 | 204 | 0.16 | 0.09 | -0.08 | 0.02 | -0.05 | -0.67 | 0.86 |
| 1 | 1 | 72 | 10 | 42 | 0.06 | -1.49 | -0.89 | -0.75 | -0.76 | -0.67 | 1.04 |
| 1 | 1 | 73 | 1 | 429 | 0 | 3.08 | 0.68 | 0.04 | 4.20 | 0.43 | 0.71 |
| 1 | 1 | 74 | 14 | 186 | 0.14 | -0.79 | -0.89 | 0.45 | -0.05 | -0.67 | -0.68 |
| 1 | 1 | 75 | 4 | 410 | 0.37 | 2.27 | 0.68 | 3.73 | -0.23 | -0.40 | 0.68 |
| 1 | 1 | 76 | 9 | 163 | 0.16 | 1.05 | -0.89 | -0.41 | -0.60 | -0.67 | 0.61 |
| 1 | 1 | 77 | 10 | 400 | 0.07 | -0.03 | 0.68 | 1.70 | 1.37 | 0.43 | -2.38 |
| 1 | 1 | 78 | 6 | 275 | 0.18 | 1.14 | 0.68 | 0.13 | -0.05 | -0.67 | -0.18 |
| 1 | 1 | 79 | 25 | 241 | 0.16 | -0.88 | 0.68 | 0.40 | -0.05 | -0.67 | -1.06 |
| 1 | 1 | 80 | 6 | 245 | 0.14 | -1.22 | 0.68 | -0.30 | -0.05 | 0.43 | -0.91 |
| 1 | 1 | 81 | 21 | 120 | 0.16 | -0.46 | -0.08 | -0.82 | -0.62 | -0.67 | 0.70 |
| 1 | 1 | 82 | 11 | 40 | 0.18 | -0.93 | -0.99 | -1.19 | -1.11 | 0.43 | 0.37 |
| 1 | 1 | 83 | 9 | 342 | 0.28 | 1.16 | 0.43 | -0.52 | -0.68 | 2.64 | 1.12 |
| 1 | 1 | 84 | 8 | 286 | 0.05 | -0.72 | 1.11 | 0.50 | -0.05 | 0.43 | -0.83 |
| 1 | 1 | 85 | 8 | 33 | 0.1 | -1.06 | -0.89 | -1.18 | -1.02 | -0.67 | -0.99 |
| 1 | 1 | 86 | 5 | 282 | 0.23 | -1.50 | 0.53 | 0.19 | -0.48 | 0.43 | -2.16 |
| 1 | 1 | 87 | 17 | 137 | 0.07 | -0.31 | 0.68 | -0.78 | -0.76 | -0.67 | 0.96 |
| 1 | 1 | 88 | 19 | 168 | 0.07 | -0.08 | -0.89 | 0.05 | -0.05 | -0.67 | 1.04 |
| 1 | 1 | 89 | 7 | 291 | 0.05 | 1.10 | -0.89 | 0.15 | 1.37 | -0.67 | 0.36 |
| 1 | 1 | 90 | 6 | 24 | 0.19 | -0.97 | -1.02 | -1.07 | -0.94 | 0.43 | -1.40 |
| 1 | 1 | 91 | 4 | 434 | 0.59 | 4.07 | 1.49 | 1.29 | 0.30 | 2.64 | 0.07 |
| 1 | 1 | 92 | 19 | 409 | 0.65 | 0.92 | 0.42 | 1.41 | 1.37 | 2.64 | -0.60 |
| 1 | 1 | 93 | 9 | 393 | 0.46 | 2.04 | 2.30 | 0.91 | -0.05 | -0.31 | -0.34 |
| 1 | 1 | 94 | 8 | 22 | 0.08 | -1.66 | -1.27 | -0.84 | -0.76 | -0.67 | 0.84 |
| 1 | 1 | 95 | 23 | 158 | 0.22 | -1.41 | 0.68 | -0.20 | -0.70 | -0.67 | -1.05 |
| 1 | 1 | 96 | 11 | 330 | 0.22 | 1.55 | 0.68 | -0.45 | -0.31 | 0.43 | -1.59 |
| 1 | 1 | 97 | 6 | 145 | 0.17 | 0.49 | -0.89 | -0.64 | -0.76 | -0.67 | -0.17 |
| 1 | 1 | 98 | 12 | 121 | 0.15 | 0.21 | -0.89 | -0.80 | -0.76 | -0.67 | -1.70 |
| 1 | 1 | 99 | 14 | 329 | 0.24 | 2.06 | 0.63 | 0.31 | -0.25 | 0.43 | 0.99 |
| 1 | 1 | 100 | 16 | 299 | 0.05 | 1.21 | -0.89 | 0.46 | 1.37 | -0.67 | 0.86 |
| 1 | 1 | 101 | 5 | 328 | 0.1 | -0.87 | 1.11 | 1.33 | -0.05 | 0.43 | -1.13 |
| 1 | 1 | 102 | 10 | 278 | 0.1 | 0.33 | -0.93 | 0.45 | 1.37 | -0.67 | 0.02 |
| 1 | 1 | 103 | 5 | 102 | 0.1 | -1.05 | 0.68 | -0.76 | -0.76 | -0.67 | 1.27 |
| 1 | 1 | 104 | 12 | 25 | 0.08 | -1.25 | -1.05 | -0.78 | -0.76 | -0.67 | 1.57 |
| 1 | 1 | 105 | 9 | 385 | 0.25 | 2.05 | 0.34 | -0.21 | -0.05 | 2.64 | 1.12 |
| 1 | 1 | 106 | 10 | 231 | 0.18 | -0.02 | -0.08 | 0.02 | -0.05 | 0.43 | 1.29 |
| 1 | 1 | 107 | 5 | 193 | 0.14 | -0.66 | -0.08 | -0.33 | -0.05 | -0.67 | -1.01 |
| 1 | 1 | 108 | 4 | 418 | 0.06 | -0.03 | 0.68 | 3.07 | 1.37 | 0.43 | -2.25 |
| 1 | 1 | 109 | 4 | 306 | 0.16 | 1.83 | 0.68 | -0.59 | -0.05 | -0.67 | -1.57 |
| 1 | 1 | 110 | 5 | 378 | 0.15 | 1.09 | 0.68 | 1.20 | 1.37 | 0.43 | -0.11 |
| 1 | 1 | 111 | 2 | 308 | 0.07 | -0.55 | 0.68 | 0.26 | 1.37 | -0.67 | -1.25 |
| 1 | 1 | 112 | 18 | 239 | 0.15 | 0.44 | 0.75 | 0.08 | -0.05 | -0.67 | 1.05 |
| 1 | 1 | 113 | 11 | 62 | 0.06 | -1.14 | -0.89 | -0.81 | -0.76 | -0.67 | 0.73 |
| 1 | 1 | 114 | 5 | 169 | 0.07 | -0.26 | -0.89 | 0.47 | -0.05 | -0.67 | 1.40 |
| 1 | 1 | 115 | 18 | 415 | 0.11 | 0.74 | -0.08 | 2.29 | 2.79 | 0.43 | -0.93 |
| 1 | 1 | 116 | 5 | 97 | 0.13 | 0.23 | -1.19 | -1.05 | -0.76 | -0.67 | 0.77 |
| 1 | 1 | 117 | 24 | 19 | 0.22 | -0.06 | 2.30 | -0.89 | -0.89 | -0.67 | 1.18 |
| 1 | 1 | 118 | 9 | 184 | 0.17 | -0.98 | 0.34 | -0.29 | -0.05 | -0.67 | -0.08 |
| 1 | 1 | 119 | 11 | 43 | 0.12 | -0.82 | -0.89 | -1.04 | -0.79 | -0.67 | -1.65 |
| 1 | 1 | 120 | 9 | 257 | 0.29 | 2.17 | -0.62 | 0.10 | -0.05 | -0.55 | 0.65 |
| 1 | 1 | 121 | 27 | 155 | 0.18 | 0.15 | 0.68 | -0.84 | -0.76 | -0.67 | 0.81 |
| 1 | 1 | 122 | 20 | 215 | 0.11 | 0.33 | -0.08 | -0.69 | -0.05 | -0.67 | -1.64 |
| 1 | 1 | 123 | 14 | 348 | 0.24 | 0.61 | 0.74 | 0.37 | 1.37 | 0.43 | 0.25 |
| 1 | 1 | 124 | 13 | 162 | 0.12 | 0.01 | -1.01 | -0.67 | -0.05 | -0.67 | -1.57 |
| 1 | 1 | 125 | 7 | 366 | 0.22 | 1.62 | 0.35 | -0.29 | 1.37 | -0.67 | -1.65 |
| 1 | 1 | 126 | 9 | 77 | 0.02 | -0.88 | -0.89 | -0.78 | -0.76 | -0.67 | 0.66 |
| 1 | 1 | 127 | 25 | 253 | 0.14 | 0.70 | 0.68 | 0.16 | -0.05 | -0.67 | 0.61 |
| 1 | 1 | 128 | 9 | 84 | 0.09 | -0.27 | -1.27 | -0.81 | -0.76 | -0.67 | 0.81 |
| 1 | 1 | 129 | 3 | 398 | 0.04 | -0.03 | -0.08 | 2.29 | 1.37 | 0.43 | -1.98 |
| 1 | 1 | 130 | 9 | 412 | 0.05 | 0.90 | -0.89 | 2.29 | 2.79 | 0.43 | -0.48 |
| 1 | 1 | 131 | 8 | 309 | 0.06 | 1.43 | -0.89 | 0.41 | 1.37 | -0.67 | 0.46 |
| 1 | 1 | 132 | 8 | 267 | 0.08 | 0.10 | 0.68 | 0.37 | -0.05 | 0.43 | 1.57 |
| 1 | 1 | 133 | 16 | 252 | 0.22 | 0.94 | -0.08 | -0.40 | -0.14 | 0.43 | 0.43 |
| 1 | 1 | 134 | 15 | 119 | 0.13 | -0.60 | -0.99 | -0.75 | -0.76 | 0.43 | 0.24 |
| 1 | 1 | 135 | 13 | 48 | 0.09 | -0.66 | -0.92 | -1.19 | -1.11 | -0.67 | 0.82 |
| 1 | 1 | 136 | 9 | 326 | 0.11 | 1.45 | -0.08 | 0.32 | 1.37 | -0.67 | 0.38 |
| 1 | 1 | 137 | 11 | 386 | 0.22 | 1.37 | 0.68 | -0.66 | -0.18 | 2.64 | -1.36 |
| 1 | 1 | 138 | 20 | 255 | 0.15 | 0.43 | -0.89 | -0.34 | -0.05 | 2.64 | 0.68 |
| 1 | 1 | 139 | 8 | 161 | 0.14 | -0.68 | -0.94 | -0.08 | -0.05 | 0.43 | 1.41 |
| 1 | 1 | 140 | 10 | 38 | 0.13 | -1.22 | -1.19 | -0.92 | -0.76 | 0.43 | 0.91 |
| 1 | 1 | 141 | 14 | 180 | 0.12 | 0.09 | 0.68 | -0.56 | -0.76 | -0.67 | 0.07 |
| 1 | 1 | 142 | 13 | 427 | 0.21 | 2.01 | 0.62 | 2.29 | 2.79 | 0.43 | -0.18 |
| 1 | 1 | 143 | 15 | 354 | 0.13 | 1.63 | 0.68 | 0.39 | 1.37 | -0.67 | 0.26 |
| 1 | 1 | 144 | 11 | 301 | 0.14 | 0.82 | -1.03 | 0.15 | 1.37 | -0.67 | -0.85 |
| 1 | 1 | 145 | 7 | 205 | 0.13 | 0.24 | -0.08 | -0.05 | -0.05 | -0.67 | 1.57 |
| 1 | 1 | 146 | 10 | 327 | 0.14 | 0.66 | 0.68 | 0.60 | 1.37 | -0.67 | 0.42 |
| 1 | 1 | 147 | 11 | 403 | 0.11 | 0.92 | 2.30 | 1.22 | 1.37 | 0.43 | -0.79 |
| 1 | 1 | 148 | 7 | 127 | 0.08 | -0.20 | -0.89 | -0.10 | -0.76 | -0.67 | 0.69 |
| 1 | 1 | 149 | 6 | 217 | 0.11 | 0.41 | -0.89 | 0.26 | -0.05 | -0.67 | -0.34 |
| 1 | 1 | 150 | 11 | 323 | 0.1 | 0.96 | -0.89 | 0.46 | 1.37 | 0.43 | 0.72 |
| 1 | 1 | 151 | 9 | 208 | 0.16 | 0.16 | 0.68 | -0.60 | -0.05 | -0.67 | 0.43 |
| 1 | 1 | 152 | 14 | 405 | 0.14 | -0.04 | 0.68 | 2.29 | 1.37 | 0.43 | -2.24 |
| 1 | 1 | 153 | 10 | 98 | 0.08 | -0.22 | -0.89 | -1.14 | -0.76 | -0.67 | 0.33 |
| 1 | 1 | 154 | 7 | 47 | 0.05 | -0.99 | -0.89 | -1.16 | -0.76 | -0.67 | 1.03 |
| 1 | 1 | 155 | 8 | 182 | 0.15 | -1.00 | 0.68 | 0.08 | -0.76 | -0.67 | -0.46 |
| 1 | 1 | 156 | 9 | 153 | 0.07 | -0.24 | -0.93 | 0.04 | -0.05 | -0.67 | 1.57 |
| 1 | 1 | 157 | 23 | 65 | 0.08 | -1.23 | -0.89 | -0.82 | -0.76 | -0.67 | 0.27 |
| 1 | 1 | 158 | 4 | 367 | 0.08 | 3.03 | -0.08 | 0.15 | -0.05 | -0.67 | -1.65 |
| 1 | 1 | 159 | 29 | 421 | 0.15 | 1.04 | 0.68 | 2.29 | 2.79 | 0.43 | -0.84 |
| 1 | 1 | 160 | 10 | 111 | 0.14 | -1.33 | -0.89 | 0.06 | -0.76 | -0.67 | -0.65 |
| 1 | 1 | 161 | 13 | 382 | 0.25 | 3.03 | 0.68 | 0.23 | -0.05 | -0.42 | -1.68 |
| 1 | 1 | 162 | 10 | 142 | 0.1 | -0.54 | -0.89 | -0.76 | -0.05 | -0.67 | -0.73 |
| 1 | 1 | 163 | 14 | 433 | 0.32 | 1.31 | -0.10 | 2.29 | 2.79 | 2.64 | -0.81 |
| 1 | 1 | 164 | 14 | 50 | 0.08 | -0.88 | -0.08 | -1.19 | -1.11 | -0.67 | 0.91 |
| 1 | 1 | 165 | 23 | 125 | 0.14 | -0.78 | 0.68 | -0.84 | -0.76 | -0.67 | 0.41 |
| 1 | 1 | 166 | 7 | 67 | 0.09 | -0.95 | 0.68 | -1.13 | -0.96 | -0.67 | 1.03 |
| 1 | 1 | 167 | 12 | 254 | 0.11 | -0.53 | 0.68 | 0.02 | -0.05 | 0.43 | -0.24 |
| 1 | 1 | 168 | 11 | 113 | 0.17 | -0.63 | -0.08 | -0.98 | -0.92 | -0.67 | -0.30 |
| 1 | 1 | 169 | 9 | 83 | 0.04 | -0.71 | -0.89 | -0.75 | -0.76 | -0.67 | 0.76 |
| 1 | 1 | 170 | 10 | 89 | 0.11 | -1.18 | -0.93 | -0.19 | -0.76 | -0.67 | 0.66 |
| 1 | 1 | 171 | 8 | 44 | 0.06 | -1.41 | -0.89 | -1.13 | -0.76 | -0.67 | 0.36 |
| 1 | 1 | 172 | 8 | 310 | 0.08 | 1.13 | -0.99 | 0.45 | 1.37 | -0.67 | -0.08 |
| 1 | 1 | 173 | 8 | 6 | 0.08 | -1.21 | -0.89 | -1.28 | -1.11 | -0.67 | -1.64 |
| 1 | 1 | 174 | 14 | 295 | 0.12 | -0.78 | -0.08 | 0.38 | -0.76 | 2.64 | -1.00 |
| 1 | 1 | 175 | 12 | 316 | 0.19 | 1.00 | -1.02 | -0.29 | 1.37 | -0.67 | -1.61 |
| 1 | 1 | 176 | 2 | 346 | 0.15 | 0.38 | -0.11 | 2.60 | -0.05 | 0.43 | 0.69 |
| 1 | 1 | 177 | 12 | 194 | 0.17 | -0.92 | 0.68 | -0.46 | -0.76 | 0.43 | -0.16 |
| 1 | 1 | 178 | 17 | 396 | 0.19 | 1.51 | 2.30 | 0.45 | 1.37 | -0.09 | 1.34 |
| 1 | 1 | 179 | 15 | 206 | 0.13 | -1.30 | -0.08 | 0.26 | -0.76 | 0.43 | -0.90 |
| 1 | 1 | 180 | 4 | 322 | 0.13 | 1.35 | -0.89 | -0.06 | 1.37 | 0.43 | 0.52 |
| 1 | 1 | 181 | 31 | 225 | 0.12 | -0.25 | 0.68 | 0.14 | -0.05 | -0.67 | 0.21 |
| 1 | 1 | 182 | 5 | 185 | 0.14 | 0.46 | -0.57 | -0.29 | -0.76 | 0.43 | 1.27 |
| 1 | 1 | 183 | 12 | 103 | 0.15 | -0.68 | -0.08 | -0.63 | -0.76 | -0.67 | 1.39 |
| 1 | 1 | 184 | 2 | 151 | 0.06 | -0.29 | -0.89 | 0.06 | -0.76 | -0.67 | -0.87 |
| 1 | 1 | 185 | 11 | 232 | 0.12 | -0.39 | 0.68 | 0.22 | -0.05 | -0.67 | -0.20 |
| 1 | 1 | 186 | 9 | 313 | 0.29 | 0.41 | -0.89 | 1.32 | 1.37 | -0.67 | 0.31 |
| 1 | 1 | 187 | 17 | 372 | 0.15 | 0.23 | 0.70 | 1.21 | 1.37 | 0.43 | -0.74 |
| 1 | 1 | 188 | 9 | 407 | 0.09 | 0.41 | 2.30 | 1.70 | 1.37 | 0.43 | -1.08 |
| 1 | 1 | 189 | 14 | 68 | 0.36 | -0.83 | 0.19 | -1.16 | -1.09 | 0.43 | 0.95 |
| 1 | 1 | 190 | 9 | 37 | 0.08 | -1.53 | -1.27 | -0.83 | -0.76 | -0.67 | 0.41 |
| 1 | 1 | 191 | 8 | 28 | 0.13 | -1.33 | -1.22 | -1.10 | -0.85 | -0.67 | -0.62 |
| 1 | 1 | 192 | 8 | 71 | 0.24 | -1.65 | -0.89 | -0.19 | -0.67 | -0.67 | 0.43 |
| 1 | 1 | 193 | 9 | 7 | 0.2 | -0.89 | -0.89 | -0.36 | -0.84 | 2.64 | 0.50 |
| 1 | 1 | 194 | 16 | 53 | 0.05 | -1.15 | -0.89 | -0.82 | -0.76 | -0.67 | 1.06 |
| 1 | 1 | 195 | 9 | 29 | 0.07 | -0.60 | -1.27 | -1.08 | -0.76 | -0.67 | -1.66 |
| 1 | 1 | 196 | 14 | 377 | 0.21 | 2.10 | 0.68 | 0.25 | 1.37 | 0.43 | 0.77 |
| 1 | 1 | 197 | 10 | 190 | 0.2 | -0.45 | -0.73 | -0.74 | -0.05 | 0.43 | -0.77 |
| 1 | 1 | 198 | 11 | 15 | 0.13 | -1.32 | -1.03 | -1.19 | -1.11 | 0.43 | 0.88 |
| 1 | 1 | 199 | 13 | 344 | 0.23 | 0.11 | 0.68 | 0.14 | -0.05 | 2.64 | -0.39 |
| 1 | 1 | 200 | 12 | 128 | 0.1 | 0.05 | -0.08 | -0.81 | -0.76 | -0.67 | 0.90 |
| 1 | 1 | 201 | 17 | 36 | 0.06 | -1.25 | -1.27 | -1.13 | -0.76 | -0.67 | 0.40 |
| 1 | 1 | 202 | 12 | 172 | 0.1 | -0.33 | -0.89 | -0.44 | -0.05 | 0.43 | 0.94 |
| 1 | 1 | 203 | 8 | 293 | 0.04 | -0.74 | 1.11 | 0.50 | -0.05 | 0.43 | -1.22 |
| 1 | 1 | 204 | 4 | 165 | 0.06 | 0.02 | -0.89 | -0.67 | -0.76 | 0.43 | -1.69 |
| 1 | 1 | 205 | 13 | 187 | 0.13 | 0.43 | -0.89 | -0.71 | -0.05 | -0.67 | -1.63 |
| 1 | 1 | 206 | 4 | 210 | 0.13 | 0.00 | 2.30 | -0.78 | -0.76 | 0.43 | 0.94 |
| 1 | 1 | 207 | 12 | 70 | 0.05 | -0.72 | -1.27 | -0.81 | -0.76 | -0.67 | 0.69 |
| 1 | 1 | 208 | 13 | 156 | 0.14 | 0.14 | -0.08 | -0.21 | -0.76 | -0.67 | 0.93 |
| 1 | 1 | 209 | 10 | 317 | 0.17 | 0.87 | 2.30 | 0.21 | -0.05 | -0.67 | 1.31 |
| 1 | 1 | 210 | 19 | 197 | 0.23 | -0.12 | 0.68 | -0.79 | -0.83 | 0.43 | 0.37 |
| 1 | 1 | 211 | 13 | 201 | 0.15 | 0.33 | -0.98 | -0.29 | -0.05 | -0.67 | -0.85 |
| 1 | 1 | 212 | 12 | 94 | 0.15 | -1.25 | -0.08 | -0.80 | -0.76 | -0.67 | 0.38 |
| 1 | 1 | 213 | 26 | 85 | 0.09 | -0.35 | -0.89 | -0.98 | -0.76 | -0.67 | -1.63 |
| 1 | 1 | 214 | 13 | 324 | 0.11 | 0.51 | 2.30 | 0.21 | -0.05 | 0.43 | 0.66 |
| 1 | 1 | 215 | 9 | 95 | 0.08 | -0.54 | 0.68 | -1.18 | -1.07 | -0.67 | 1.01 |
| 1 | 1 | 216 | 23 | 126 | 0.14 | -0.21 | -0.08 | -1.01 | -0.77 | -0.67 | -1.64 |
| 1 | 1 | 217 | 10 | 256 | 0.2 | 0.88 | 0.68 | -0.39 | -0.05 | -0.67 | -0.79 |
| 1 | 1 | 218 | 19 | 356 | 0.34 | -0.96 | 0.52 | 1.97 | -0.20 | -0.67 | -2.18 |
| 1 | 1 | 219 | 14 | 147 | 0.11 | 0.17 | -0.08 | -0.77 | -0.76 | -0.67 | 0.27 |
| 1 | 1 | 220 | 14 | 240 | 0.2 | 0.44 | 0.86 | 0.03 | -0.10 | -0.67 | 1.57 |
| 1 | 1 | 221 | 13 | 191 | 0.18 | 0.14 | -0.08 | -0.75 | -0.49 | 0.43 | 0.67 |
| 1 | 1 | 222 | 7 | 380 | 0.05 | 0.04 | -0.08 | 1.70 | 1.37 | 0.43 | -1.23 |
| 1 | 1 | 223 | 18 | 88 | 0.05 | -0.46 | -0.89 | -0.77 | -0.76 | -0.67 | 1.00 |
| 1 | 1 | 224 | 12 | 417 | 0.1 | 1.69 | 0.68 | 2.29 | 2.79 | -0.67 | 0.38 |
| 1 | 1 | 225 | 18 | 413 | 0.27 | 2.56 | 0.68 | -0.04 | 1.37 | 2.64 | 0.59 |
| 1 | 1 | 226 | 9 | 401 | 0.08 | 0.81 | -0.89 | 1.20 | 1.37 | 2.64 | -0.61 |
| 1 | 1 | 227 | 8 | 49 | 0.08 | -1.28 | -0.89 | -0.29 | -0.76 | -0.67 | 1.41 |
| 1 | 1 | 228 | 8 | 12 | 0.18 | -1.32 | -1.03 | -0.93 | -0.85 | 0.43 | 1.57 |
| 1 | 1 | 229 | 6 | 237 | 0.19 | 0.49 | 0.17 | -0.73 | -0.76 | 0.43 | -1.52 |
| 1 | 1 | 230 | 16 | 266 | 0.12 | -0.03 | 0.68 | 0.49 | -0.05 | 0.43 | 0.54 |
| 1 | 1 | 231 | 5 | 82 | 0.12 | -1.25 | -1.12 | -0.29 | -0.05 | -0.67 | 1.27 |
| 1 | 1 | 232 | 24 | 110 | 0.12 | -0.42 | -1.00 | -0.71 | -0.76 | -0.67 | -0.07 |
| 1 | 1 | 233 | 14 | 270 | 0.15 | 0.34 | -1.11 | 0.45 | 1.37 | -0.67 | 1.39 |
| 1 | 1 | 234 | 4 | 200 | 0.07 | -0.30 | -0.89 | 0.26 | -0.05 | 0.43 | 1.57 |
| 1 | 1 | 235 | 16 | 263 | 0.29 | 0.70 | 0.35 | -0.52 | -0.18 | 0.43 | -0.62 |
| 1 | 1 | 236 | 12 | 280 | 0.21 | 1.35 | 0.68 | -0.41 | -0.23 | 0.43 | 0.90 |
| 1 | 1 | 237 | 8 | 281 | 0.07 | -0.96 | 0.68 | 0.48 | -0.05 | 0.43 | -1.23 |
| 1 | 1 | 238 | 7 | 287 | 0.13 | 0.20 | 0.68 | 0.45 | -0.05 | 0.43 | -0.55 |
| 1 | 1 | 239 | 16 | 116 | 0.18 | -1.23 | -0.08 | -0.57 | -0.76 | -0.67 | -0.37 |
| 1 | 1 | 240 | 31 | 321 | 0.42 | -0.50 | 2.30 | 0.40 | -0.12 | -0.03 | -1.00 |
| 1 | 1 | 241 | 16 | 420 | 0.13 | 1.51 | -0.08 | 2.29 | 2.79 | 0.43 | -0.49 |
| 1 | 1 | 242 | 8 | 199 | 0.12 | 0.05 | -1.03 | 0.20 | -0.05 | -0.67 | -0.07 |
| 1 | 1 | 243 | 8 | 314 | 0.07 | 0.99 | -0.08 | 0.45 | 1.37 | -0.67 | 0.72 |
| 1 | 1 | 244 | 7 | 289 | 0.06 | -0.20 | 0.68 | 1.23 | -0.05 | 0.43 | 0.35 |
| 1 | 1 | 245 | 15 | 216 | 0.16 | -0.16 | 0.68 | 0.03 | -0.05 | -0.67 | 0.76 |
| 1 | 1 | 246 | 14 | 273 | 0.16 | 0.31 | 0.63 | 0.14 | -0.05 | 0.43 | 0.07 |
| 1 | 1 | 247 | 20 | 132 | 0.18 | -0.76 | -0.08 | -0.14 | -0.72 | -0.67 | 0.74 |
| 1 | 1 | 248 | 5 | 369 | 0.08 | -0.74 | 0.68 | 1.70 | -0.05 | 0.43 | -2.35 |
| 1 | 1 | 249 | 11 | 432 | 0.33 | 0.40 | 1.35 | 2.13 | 1.37 | 2.64 | -2.28 |
| 1 | 1 | 250 | 6 | 341 | 0.22 | 0.46 | 0.62 | 0.21 | 1.37 | 0.43 | 1.36 |
| 1 | 1 | 251 | 24 | 404 | 0.12 | 1.30 | -0.89 | 2.29 | 2.79 | -0.67 | 0.33 |
| 1 | 1 | 252 | 9 | 63 | 0.05 | -1.23 | -0.89 | -0.75 | -0.76 | -0.67 | 0.65 |
| 1 | 1 | 253 | 11 | 226 | 0.16 | 0.32 | -0.96 | -0.44 | -0.05 | 0.43 | -0.89 |
| 1 | 1 | 254 | 11 | 133 | 0.09 | -0.43 | -0.89 | -0.75 | -0.05 | -0.67 | 0.33 |
| 1 | 1 | 255 | 19 | 376 | 0.16 | 0.60 | 0.68 | 1.17 | 1.37 | 0.43 | -0.70 |
| 1 | 1 | 256 | 8 | 80 | 0.06 | -0.23 | -0.89 | -1.15 | -0.76 | -0.67 | 0.91 |
| 1 | 1 | 257 | 13 | 164 | 0.27 | -0.58 | 0.15 | -0.89 | -0.87 | 0.43 | -0.58 |
| 1 | 1 | 258 | 7 | 78 | 0.07 | -0.82 | -0.89 | -0.29 | -0.76 | -0.67 | 1.33 |
| 1 | 1 | 259 | 5 | 422 | 0.05 | 1.83 | -0.89 | 2.29 | 2.79 | 0.43 | -0.23 |
| 1 | 1 | 260 | 10 | 23 | 0.11 | -1.45 | -1.23 | -1.16 | -0.83 | -0.67 | -0.10 |
| 1 | 1 | 261 | 8 | 60 | 0.05 | -0.57 | -1.27 | -1.14 | -0.76 | -0.67 | 0.44 |
| 1 | 1 | 262 | 12 | 93 | 0.12 | -0.88 | -0.92 | -0.78 | -0.76 | -0.67 | -0.41 |
| 1 | 1 | 263 | 20 | 279 | 0.24 | 0.33 | 2.30 | 0.14 | -0.19 | -0.67 | 0.31 |
| 1 | 1 | 264 | 15 | 75 | 0.05 | -0.67 | -0.89 | -1.10 | -0.76 | -0.67 | 0.39 |
| 1 | 1 | 265 | 15 | 351 | 0.04 | 0.27 | -0.89 | 1.20 | 1.37 | 0.43 | -0.45 |
| 1 | 1 | 266 | 7 | 408 | 0.25 | 2.71 | -0.89 | -0.27 | 1.37 | 2.64 | 0.06 |
| 1 | 1 | 267 | 10 | 57 | 0.07 | -0.55 | -1.27 | -1.15 | -0.76 | -0.67 | 0.82 |
| 1 | 1 | 268 | 13 | 387 | 0.24 | 0.11 | 0.68 | 1.48 | -0.05 | 2.64 | -0.61 |
| 1 | 1 | 269 | 6 | 436 | 0.27 | 1.23 | 2.30 | 1.91 | 2.08 | 2.64 | -1.04 |
| 1 | 1 | 270 | 11 | 202 | 0.09 | 0.20 | -0.89 | 0.47 | -0.05 | -0.67 | 0.68 |
| 1 | 1 | 271 | 12 | 402 | 0.47 | 1.54 | 0.81 | 2.42 | 1.37 | 0.34 | 0.54 |
| 1 | 1 | 272 | 12 | 152 | 0.15 | -0.05 | -0.08 | -0.83 | -0.76 | -0.67 | -0.87 |
| 1 | 1 | 273 | 9 | 21 | 0.09 | -1.48 | -0.89 | -1.15 | -0.96 | -0.67 | 0.95 |
| 1 | 1 | 274 | 11 | 69 | 0.05 | -0.74 | -0.89 | -0.84 | -0.76 | -0.67 | 1.11 |
| 1 | 1 | 275 | 14 | 249 | 0.24 | -0.06 | -0.77 | -0.01 | -0.15 | 2.64 | 0.62 |
| 1 | 1 | 276 | 4 | 318 | 0.08 | -1.02 | 0.68 | 1.35 | -0.05 | 0.43 | -1.21 |
| 1 | 1 | 277 | 19 | 247 | 0.15 | 0.09 | 0.68 | -0.36 | -0.05 | 0.43 | 0.65 |
| 1 | 1 | 278 | 14 | 297 | 0.17 | 1.23 | 0.68 | 0.23 | -0.05 | 0.43 | 0.58 |
| 1 | 1 | 279 | 8 | 338 | 0.08 | 0.88 | 0.68 | 0.45 | 1.37 | -0.67 | 1.41 |
| 1 | 1 | 280 | 13 | 171 | 0.1 | -0.38 | -0.92 | 0.00 | -0.05 | -0.67 | 0.06 |
| 1 | 1 | 281 | 14 | 371 | 0.28 | -0.93 | 2.30 | 0.46 | -0.20 | -0.12 | -2.31 |
| 1 | 1 | 282 | 24 | 173 | 0.33 | -0.28 | 0.73 | -0.75 | -0.82 | 0.43 | 0.97 |
| 1 | 1 | 283 | 18 | 355 | 0.14 | 1.38 | 0.68 | -0.21 | 1.37 | 0.43 | 0.42 |
| 1 | 1 | 284 | 4 | 345 | 0.11 | 0.83 | -0.49 | -0.66 | 1.37 | 0.43 | -1.59 |
| 1 | 1 | 285 | 11 | 56 | 0.05 | -0.83 | -1.27 | -0.78 | -0.76 | -0.67 | 1.01 |
| 1 | 1 | 286 | 7 | 370 | 0.19 | 1.55 | 0.68 | -0.38 | 1.37 | 0.43 | -1.02 |
| 1 | 1 | 287 | 10 | 64 | 0.06 | -0.48 | -0.89 | -1.19 | -1.11 | -0.67 | 0.41 |
| 1 | 1 | 288 | 9 | 167 | 0.15 | -0.36 | -0.89 | -0.71 | -0.05 | 0.43 | 0.29 |
| 1 | 1 | 289 | 6 | 124 | 0.15 | -1.28 | -1.02 | -0.29 | -0.05 | -0.67 | -0.30 |
| 1 | 1 | 290 | 14 | 272 | 0.11 | -0.70 | 0.68 | 0.43 | -0.05 | 0.43 | -0.78 |
| 1 | 1 | 291 | 25 | 188 | 0.09 | 0.21 | -0.89 | 0.05 | -0.05 | -0.67 | 0.52 |
| 1 | 1 | 292 | 23 | 435 | 0.3 | 1.05 | 2.30 | 2.59 | 2.79 | 0.43 | -1.56 |
| 1 | 1 | 293 | 8 | 8 | 0.08 | -1.66 | -1.27 | -1.18 | -0.94 | -0.67 | 1.07 |
| 1 | 1 | 294 | 14 | 431 | 0.2 | 2.14 | 2.30 | 2.29 | 2.79 | -0.60 | 0.40 |
| 1 | 1 | 295 | 3 | 96 | 0.08 | -0.16 | -0.89 | -0.58 | -0.76 | -0.67 | 1.57 |
| 1 | 1 | 296 | 9 | 87 | 0.06 | -0.46 | -1.27 | -0.78 | -0.76 | -0.67 | 0.39 |
| 1 | 1 | 297 | 6 | 414 | 0.04 | 0.90 | -0.89 | 2.29 | 2.79 | 0.43 | -0.85 |
| 1 | 1 | 298 | 10 | 230 | 0.18 | 0.31 | -0.89 | 0.32 | -0.12 | 0.43 | 0.71 |
| 1 | 1 | 299 | 13 | 58 | 0.11 | -0.84 | -0.95 | -1.16 | -0.98 | -0.67 | -0.08 |
| 1 | 1 | 300 | 11 | 4 | 0.31 | 0.63 | -1.03 | -0.85 | -0.76 | 2.64 | 0.96 |
| 1 | 1 | 301 | 24 | 277 | 0.18 | 0.59 | 0.68 | 0.21 | -0.08 | 0.43 | 0.76 |
| 1 | 1 | 302 | 13 | 130 | 0.1 | -0.85 | -0.08 | -0.61 | -0.76 | -0.67 | -0.80 |
| 1 | 1 | 303 | 6 | 131 | 0.17 | -0.53 | 0.68 | -0.61 | -0.52 | -0.67 | 1.57 |
| 1 | 1 | 304 | 10 | 55 | 0.09 | -1.09 | -1.27 | -0.81 | -0.76 | -0.67 | 0.53 |
| 1 | 1 | 305 | 21 | 269 | 0.11 | -1.16 | 0.68 | 0.48 | -0.05 | -0.67 | -2.25 |
| 1 | 1 | 306 | 21 | 91 | 0.1 | -0.45 | -0.08 | -1.19 | -1.11 | -0.67 | 0.57 |
| 1 | 1 | 307 | 9 | 104 | 0.08 | -0.46 | -0.98 | -0.87 | -0.76 | -0.67 | -1.12 |
| 1 | 1 | 308 | 19 | 32 | 0.07 | -1.02 | -0.89 | -1.19 | -1.11 | -0.67 | 0.97 |
| 1 | 1 | 309 | 4 | 268 | 0.13 | 0.22 | -0.89 | 0.07 | -0.05 | 2.64 | 1.46 |
| 1 | 1 | 310 | 4 | 59 | 0.03 | -0.93 | -0.89 | -1.14 | -0.76 | -0.67 | 0.69 |
| 1 | 1 | 311 | 4 | 2 | 0.12 | -0.52 | -0.89 | -0.66 | -0.76 | 2.64 | 1.22 |
| 1 | 1 | 312 | 8 | 3 | 0.15 | -1.66 | -1.22 | -1.08 | -0.89 | -0.67 | 1.57 |
| 1 | 1 | 313 | 8 | 141 | 0.14 | -0.61 | -0.89 | -0.21 | -0.76 | 0.43 | 0.62 |
| 1 | 1 | 314 | 14 | 183 | 0.08 | -0.35 | -0.89 | 0.45 | -0.05 | -0.67 | 0.46 |
| 1 | 1 | 315 | 6 | 192 | 0.15 | 0.62 | -0.89 | -0.58 | -0.76 | 0.43 | -0.26 |
| 1 | 1 | 316 | 6 | 196 | 0.17 | -0.81 | -0.76 | -0.07 | -0.05 | 0.43 | 0.02 |
| 1 | 1 | 317 | 15 | 92 | 0.04 | -0.75 | -0.89 | -0.75 | -0.76 | -0.67 | 0.36 |
| 1 | 1 | 318 | 6 | 214 | 0.21 | 1.06 | 0.17 | -0.90 | -0.64 | -0.67 | -1.66 |
| 1 | 1 | 319 | 9 | 389 | 0.05 | 0.14 | 0.68 | 1.70 | 1.37 | 0.43 | -1.23 |
| 1 | 1 | 320 | 8 | 381 | 0.26 | -0.31 | 2.30 | 1.86 | -0.05 | -0.12 | -0.99 |
| 1 | 1 | 321 | 2 | 438 | 0.38 | 2.15 | 1.49 | 6.57 | 1.37 | 0.43 | -0.28 |
| 1 | 1 | 322 | 18 | 224 | 0.09 | -0.62 | 0.68 | 0.03 | -0.05 | -0.67 | -0.65 |
| 1 | 1 | 323 | 7 | 361 | 0.05 | 0.63 | -0.89 | 1.20 | 1.37 | 0.43 | -0.72 |
| 1 | 1 | 324 | 8 | 352 | 0.12 | 0.50 | -0.89 | 1.20 | 1.37 | 0.43 | -0.08 |
| 1 | 1 | 325 | 8 | 234 | 0.1 | -0.43 | 0.68 | -0.12 | -0.05 | 0.43 | 1.46 |
| 1 | 1 | 326 | 4 | 17 | 0.04 | -1.04 | -0.08 | -1.19 | -1.11 | -0.67 | 1.57 |
| 1 | 1 | 327 | 3 | 5 | 0.04 | -2.07 | -0.89 | -1.18 | -1.11 | -0.67 | 0.90 |
| 1 | 1 | 328 | 9 | 304 | 0.1 | -0.22 | 0.68 | 1.23 | -0.05 | 0.43 | -0.46 |
| 1 | 1 | 329 | 9 | 350 | 0.09 | -1.17 | 2.30 | 0.45 | -0.76 | -0.67 | -2.26 |
| 1 | 1 | 330 | 12 | 294 | 0.1 | 0.98 | -0.92 | 0.45 | 1.37 | -0.67 | 0.42 |
| 1 | 1 | 331 | 9 | 307 | 0.16 | 0.42 | -1.14 | 0.45 | 1.37 | 0.43 | 1.02 |
| 1 | 1 | 332 | 9 | 439 | 0.49 | 1.48 | 1.28 | 4.62 | 3.89 | 0.92 | -2.46 |
| 1 | 1 | 333 | 10 | 148 | 0.16 | -1.03 | -0.57 | -0.22 | -0.76 | 0.43 | -0.12 |
| 1 | 1 | 334 | 6 | 424 | 0.25 | 1.52 | 2.30 | 0.32 | 1.37 | 2.64 | 0.77 |
| 1 | 1 | 335 | 19 | 52 | 0.13 | -1.61 | -0.91 | -0.75 | -0.78 | -0.67 | -0.31 |
| 1 | 1 | 336 | 12 | 31 | 0.05 | -1.28 | -0.89 | -1.19 | -1.11 | -0.67 | 0.60 |
| 1 | 1 | 337 | 19 | 123 | 0.2 | -1.52 | -0.08 | -0.20 | -0.76 | -0.67 | -0.94 |
| 1 | 1 | 338 | 11 | 218 | 0.17 | -1.37 | 0.54 | 0.45 | -0.76 | -0.67 | -2.18 |
| 1 | 1 | 339 | 18 | 262 | 0.2 | 0.88 | 0.68 | -0.51 | -0.17 | 0.43 | 0.51 |
| 1 | 1 | 340 | 3 | 325 | 0.07 | -1.02 | 0.68 | 0.58 | -0.05 | 0.43 | -2.46 |
| 1 | 1 | 341 | 7 | 212 | 0.1 | -1.39 | 0.68 | -0.26 | -0.76 | 0.43 | -1.09 |
| 1 | 1 | 342 | 12 | 41 | 0.1 | -1.75 | -0.92 | -0.62 | -0.76 | -0.67 | -0.88 |
| 1 | 1 | 343 | 7 | 290 | 0.08 | 0.87 | -1.27 | 0.46 | 1.37 | -0.67 | 0.86 |
| 1 | 1 | 344 | 8 | 139 | 0.12 | -1.50 | -0.89 | 0.01 | -0.76 | 0.43 | -0.63 |
| 1 | 1 | 345 | 14 | 360 | 0.27 | 2.68 | 0.68 | 0.44 | -0.25 | 0.43 | 0.22 |
| 1 | 1 | 346 | 6 | 181 | 0.05 | -0.41 | -0.89 | -0.29 | -0.05 | 0.43 | 0.55 |
| 1 | 1 | 347 | 13 | 353 | 0.11 | 1.54 | 0.68 | 0.47 | 1.37 | -0.67 | 0.87 |
| 1 | 1 | 348 | 8 | 349 | 0.13 | 1.34 | 0.30 | 0.15 | 1.37 | -0.67 | -0.87 |
| 1 | 1 | 349 | 6 | 61 | 0.04 | -0.65 | -0.89 | -1.14 | -0.76 | -0.67 | 0.94 |
| 1 | 1 | 350 | 21 | 336 | 0.25 | 0.45 | 0.68 | 0.04 | -0.12 | 2.64 | 0.86 |
| 1 | 1 | 351 | 12 | 235 | 0.26 | -0.42 | 0.36 | -0.39 | -0.76 | 2.64 | 0.12 |
| 1 | 1 | 352 | 10 | 115 | 0.16 | -1.06 | -1.08 | -0.38 | -0.05 | -0.67 | 0.60 |
| 1 | 1 | 353 | 5 | 339 | 0.15 | -0.28 | 0.22 | 2.29 | -0.05 | 0.43 | -0.55 |
| 1 | 1 | 354 | 16 | 397 | 0.34 | -0.72 | 2.30 | 1.86 | -0.14 | -0.26 | -2.30 |
| 1 | 1 | 355 | 4 | 236 | 0.15 | -0.45 | 0.68 | -0.08 | -0.05 | -0.67 | -1.43 |
| 1 | 1 | 356 | 11 | 160 | 0.1 | -0.31 | -1.27 | 0.12 | -0.05 | -0.67 | 0.58 |
| 1 | 1 | 357 | 15 | 10 | 0.12 | -1.28 | -1.27 | -1.05 | -0.83 | -0.67 | -1.64 |
| 1 | 1 | 358 | 4 | 392 | 0.2 | 0.97 | 2.30 | 0.54 | 1.37 | 0.43 | 0.28 |
| 1 | 1 | 359 | 10 | 144 | 0.11 | 0.46 | -0.89 | -0.29 | -0.76 | -0.67 | 0.75 |
| 1 | 1 | 360 | 7 | 213 | 0.07 | -0.32 | -0.89 | 0.48 | -0.05 | 0.43 | 0.59 |
| 1 | 1 | 361 | 9 | 394 | 0.28 | 2.75 | 0.68 | -0.09 | -0.29 | 2.64 | 0.44 |
| 1 | 1 | 362 | 7 | 340 | 0.2 | 0.97 | -1.00 | 0.03 | 1.37 | 0.43 | -0.83 |
| 1 | 1 | 363 | 21 | 251 | 0.24 | 1.15 | 0.68 | -0.21 | -0.12 | -0.67 | 0.79 |
| 1 | 1 | 364 | 11 | 118 | 0.1 | 0.14 | -0.89 | -0.84 | -0.76 | -0.67 | 0.42 |
| 1 | 1 | 365 | 12 | 99 | 0.04 | -0.35 | -0.89 | -0.79 | -0.76 | -0.67 | 0.76 |
| 1 | 1 | 366 | 8 | 388 | 0.15 | 1.15 | 2.30 | 0.73 | 1.37 | -0.67 | 0.52 |
| 1 | 1 | 367 | 8 | 260 | 0.14 | 0.41 | 0.68 | 0.40 | -0.05 | -0.67 | -0.36 |
| 1 | 1 | 368 | 12 | 117 | 0.09 | -0.93 | -0.89 | -0.04 | -0.76 | -0.67 | 0.01 |
| 1 | 1 | 369 | 12 | 303 | 0.05 | -0.52 | 0.68 | 1.23 | -0.05 | 0.43 | -0.83 |
| 1 | 1 | 370 | 7 | 395 | 0.21 | 1.06 | 2.30 | 0.16 | -0.05 | 2.64 | 0.75 |
| 1 | 1 | 371 | 17 | 81 | 0.13 | -1.49 | -0.91 | -0.30 | -0.76 | -0.67 | -0.20 |
| 1 | 1 | 372 | 10 | 209 | 0.28 | 0.96 | 0.60 | -0.64 | -0.76 | -0.67 | -0.04 |
| 1 | 1 | 373 | 17 | 66 | 0.08 | -1.33 | -0.89 | -0.81 | -0.76 | -0.67 | -0.51 |
| 1 | 1 | 374 | 14 | 136 | 0.12 | -0.28 | -0.92 | -0.50 | -0.05 | -0.67 | 1.05 |
| 1 | 1 | 375 | 20 | 198 | 0.09 | -0.42 | -0.89 | 0.45 | -0.05 | -0.67 | -0.41 |
| 1 | 1 | 376 | 16 | 26 | 0.08 | -1.40 | -1.27 | -1.15 | -0.76 | -0.67 | 0.81 |
| 1 | 1 | 377 | 7 | 343 | 0.22 | 0.87 | -0.89 | -0.65 | -0.35 | 2.64 | -1.45 |
| 1 | 1 | 378 | 10 | 39 | 0.06 | -1.04 | -1.27 | -1.11 | -0.76 | -0.67 | 0.91 |
| 1 | 1 | 379 | 8 | 229 | 0.09 | 0.17 | 0.68 | -0.63 | -0.05 | -0.67 | -0.85 |
| 1 | 1 | 380 | 15 | 175 | 0.14 | 0.41 | -0.89 | -0.46 | -0.05 | -0.67 | 0.41 |
| 1 | 1 | 381 | 13 | 35 | 0.07 | -1.28 | -1.27 | -0.82 | -0.76 | -0.67 | 1.03 |
| 1 | 1 | 382 | 14 | 220 | 0.09 | -0.81 | -0.08 | 0.45 | -0.05 | -0.67 | -0.96 |
| 1 | 1 | 383 | 11 | 283 | 0.12 | 0.38 | 2.30 | -0.05 | -0.05 | -0.67 | 1.46 |
| 1 | 1 | 384 | 10 | 134 | 0.12 | -0.09 | 1.02 | -0.78 | -0.76 | -0.67 | 1.36 |
| 1 | 1 | 385 | 14 | 416 | 0.08 | 1.33 | -0.89 | 2.29 | 2.79 | 0.43 | -0.30 |
| 1 | 1 | 386 | 15 | 76 | 0.18 | -1.35 | -0.94 | -0.63 | -0.81 | 0.43 | 0.43 |
| 1 | 1 | 387 | 23 | 157 | 0.08 | -0.37 | -0.89 | -0.04 | -0.05 | -0.67 | 0.67 |
| 1 | 1 | 388 | 4 | 391 | 0.26 | 1.95 | 0.49 | 0.42 | -0.05 | 2.64 | 0.27 |
| 1 | 1 | 389 | 6 | 105 | 0.08 | -0.68 | 0.68 | -1.25 | -1.11 | -0.67 | -1.44 |
| 1 | 1 | 390 | 8 | 315 | 0.16 | 0.60 | -1.08 | 0.37 | 1.37 | 0.43 | 0.25 |
| 1 | 1 | 391 | 14 | 72 | 0.04 | -0.78 | -0.89 | -0.76 | -0.76 | -0.67 | 0.96 |
| 1 | 1 | 392 | 8 | 259 | 0.15 | 0.35 | 0.79 | -0.12 | -0.05 | 0.43 | 1.28 |
| 1 | 1 | 393 | 18 | 222 | 0.1 | -0.38 | -0.08 | 0.45 | -0.05 | -0.67 | -0.53 |
| 1 | 1 | 394 | 11 | 333 | 0.06 | 1.49 | -0.08 | 0.46 | 1.37 | -0.67 | 0.89 |
| 1 | 1 | 395 | 15 | 276 | 0.1 | -0.28 | 0.68 | 0.32 | -0.05 | 0.43 | -0.74 |
| 1 | 1 | 396 | 9 | 16 | 0.11 | -1.53 | -1.27 | -1.19 | -1.03 | -0.67 | 0.53 |
| 1 | 1 | 397 | 9 | 364 | 0.08 | 0.27 | -0.08 | 1.20 | 1.37 | 0.43 | -0.75 |
| 1 | 1 | 398 | 10 | 211 | 0.14 | 0.00 | -1.00 | -0.05 | -0.05 | 0.43 | 0.18 |
| 1 | 1 | 399 | 7 | 384 | 0.25 | 0.30 | 0.74 | 1.76 | 1.37 | 0.43 | -0.80 |
| 1 | 1 | 400 | 6 | 177 | 0.14 | 1.15 | -0.08 | -0.61 | -0.76 | -0.67 | 0.67 |
| 1 | 1 | 401 | 19 | 203 | 0.11 | -0.13 | -1.01 | 0.05 | -0.05 | 0.43 | 0.76 |
| 1 | 1 | 402 | 8 | 312 | 0.09 | 0.73 | -0.08 | 0.45 | 1.37 | -0.67 | 1.41 |
| 1 | 1 | 403 | 9 | 258 | 0.2 | -0.18 | -0.89 | 2.29 | -0.05 | -0.67 | -0.38 |
| 1 | 1 | 404 | 11 | 244 | 0.16 | -1.09 | 0.68 | 0.39 | -0.76 | 0.43 | -0.94 |
| 1 | 1 | 405 | 11 | 319 | 0.21 | 0.03 | 0.54 | 2.29 | -0.05 | -0.67 | -0.36 |
| 1 | 1 | 406 | 15 | 159 | 0.08 | -0.10 | -1.27 | 0.08 | -0.05 | -0.67 | 0.91 |
| 1 | 1 | 407 | 33 | 221 | 0.14 | 0.64 | -0.08 | -0.05 | -0.05 | -0.67 | 0.68 |
| 1 | 1 | 408 | 11 | 112 | 0.09 | -0.50 | 0.68 | -1.18 | -1.08 | -0.67 | 0.48 |
| 1 | 1 | 409 | 17 | 399 | 0.31 | 1.42 | 0.73 | 0.26 | 1.37 | 2.64 | 0.75 |
| 1 | 1 | 410 | 5 | 20 | 0.08 | -1.28 | -1.27 | -1.05 | -0.90 | -0.67 | -1.12 |
| 1 | 1 | 411 | 6 | 34 | 0.08 | -0.85 | -0.08 | -1.25 | -1.11 | -0.67 | -1.44 |
| 1 | 1 | 412 | 6 | 426 | 0.2 | 3.36 | 0.68 | -0.27 | 1.37 | 2.64 | -0.22 |
| 1 | 1 | 413 | 12 | 285 | 0.1 | 0.72 | -1.24 | 0.30 | 1.37 | -0.67 | 0.32 |
| 1 | 1 | 414 | 8 | 363 | 0.08 | 1.28 | 1.11 | 0.45 | 1.37 | -0.67 | 1.36 |
| 1 | 1 | 415 | 8 | 437 | 0.27 | 1.02 | 1.00 | 3.07 | 2.79 | 2.64 | -2.30 |
| 1 | 1 | 416 | 7 | 284 | 0.19 | 0.20 | -0.66 | -0.74 | -0.05 | 2.64 | -0.71 |
| 1 | 1 | 417 | 9 | 264 | 0.06 | 0.29 | -0.89 | 0.45 | 1.37 | -0.67 | 0.66 |
| 1 | 1 | 418 | 21 | 238 | 0.21 | 0.24 | -0.08 | -0.09 | -0.05 | 0.43 | 0.69 |
| 1 | 1 | 419 | 3 | 296 | 0.02 | -0.47 | 0.68 | 0.03 | -0.05 | 0.43 | -2.34 |
| 1 | 1 | 420 | 10 | 100 | 0.16 | -0.44 | -0.97 | -0.87 | -0.76 | 0.43 | 1.21 |
| 1 | 1 | 421 | 11 | 174 | 0.15 | -0.61 | 0.68 | -0.02 | -0.76 | -0.67 | 0.17 |
| 1 | 1 | 422 | 9 | 311 | 0.25 | 1.52 | -0.53 | -0.55 | -0.76 | 2.64 | 0.36 |
| 1 | 1 | 423 | 6 | 135 | 0.09 | -0.58 | -1.27 | 0.12 | -0.05 | -0.67 | 1.29 |
| 1 | 1 | 424 | 5 | 368 | 0.15 | 2.37 | 0.68 | 0.36 | 1.37 | -0.67 | 0.68 |
| 1 | 1 | 425 | 9 | 373 | 0.12 | 1.65 | 0.68 | 0.39 | 1.37 | 0.43 | -0.01 |
| 1 | 1 | 426 | 10 | 228 | 0.13 | 0.42 | -0.08 | -0.33 | -0.05 | -0.67 | -0.92 |
| 1 | 1 | 427 | 20 | 86 | 0.22 | -1.23 | -0.93 | -0.85 | -0.83 | 0.43 | -0.46 |
| 1 | 1 | 428 | 7 | 106 | 0.06 | -0.07 | -0.89 | -0.76 | -0.76 | -0.67 | 0.98 |
| 1 | 1 | 429 | 8 | 138 | 0.12 | -0.45 | -0.99 | -0.78 | -0.76 | 0.43 | -0.77 |
| 1 | 1 | 430 | 8 | 30 | 0.12 | -0.51 | 1.05 | -1.19 | -1.11 | -0.67 | 1.36 |
| 1 | 1 | 431 | 19 | 357 | 0.16 | 1.03 | 0.68 | -0.27 | -0.05 | 2.64 | 0.61 |
| 1 | 1 | 432 | 20 | 406 | 0.38 | 0.02 | 2.30 | 0.99 | -0.05 | 2.64 | -1.12 |
| 1 | 1 | 433 | 10 | 302 | 0.26 | 1.83 | 0.53 | -0.09 | -0.33 | 0.43 | 0.24 |
| 1 | 1 | 434 | 5 | 73 | 0.04 | -0.72 | -0.89 | -0.95 | -0.76 | -0.67 | 0.82 |
| 1 | 1 | 435 | 1 | 440 | 0 | 5.51 | 0.68 | 3.07 | 4.20 | 2.64 | 0.50 |
| 1 | 1 | 436 | 14 | 300 | 0.29 | 2.04 | 0.68 | 0.38 | -0.15 | -0.67 | 0.69 |
| 1 | 1 | 437 | 19 | 74 | 0.1 | -1.10 | -0.89 | -0.81 | -0.76 | -0.67 | -0.90 |
| 1 | 1 | 438 | 7 | 347 | 0.13 | 1.21 | -0.08 | 0.41 | 1.37 | 0.43 | 0.59 |
| 1 | 1 | 439 | 31 | 374 | 0.28 | -0.27 | 0.87 | 0.49 | -0.05 | 2.64 | -1.24 |
| 1 | 1 | 440 | 17 | 233 | 0.2 | -0.48 | 0.68 | -0.10 | -0.13 | 0.43 | 0.60 |
Now let us understand what each column in the above table means:
Segment.Level - Level of the cell. In this case, we have performed vector quantization for depth 1, so the segment level is 1.
Segment.Parent - Parent segment of the cell.
Segment.Child (Cell.Number) - The children of a particular cell. In this case, it is the total number of cells at which we achieved the defined compression percentage.
n - Number of points in each cell.
Cell.ID - Cell IDs generated for the multivariate data using the 1-D Sammon’s projection algorithm.
Quant.Error - Quantization error for each cell.
All the columns after this contain the centroids for each cell. Together they form the codebook: the collection of all centroids (codewords).
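As a toy illustration of these columns (not the muHVT internals), a codebook, the n counts, and a max-style quantization error per cell can be computed with base R’s kmeans. The mean absolute difference used here is an assumed stand-in for the package’s L1_Norm metric:

```r
# Toy sketch: build a codebook with k-means and compute per-cell statistics
# analogous to the n and Quant.Error columns above. The mean-L1 distance is
# an assumption standing in for muHVT's L1_Norm metric.
set.seed(240)
X <- scale(matrix(rnorm(600), ncol = 3))    # 200 points, 3 standardized features
km <- stats::kmeans(X, centers = 5)
codebook <- km$centers                      # one centroid (codeword) per cell
n_per_cell <- as.vector(table(km$cluster))  # the 'n' column
quant_err <- sapply(seq_len(nrow(codebook)), function(i) {
  pts <- X[km$cluster == i, , drop = FALSE]
  max(rowMeans(abs(sweep(pts, 2, codebook[i, ]))))  # error_metric = "max"
})
```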
Now, let’s check the compression summary for HVT (map A). The table below shows no of cells, no of cells having quantization error below threshold and percentage of cells having quantization error below threshold for each level.
mapA_compression_summary <- map_A[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapA_compression_summary)

| segmentLevel | noOfCells | noOfCellsBelowQuantizationError | percentOfCellsBelowQuantizationErrorThreshold | parameters |
|---|---|---|---|---|
| 1 | 440 | 355 | 0.81 | n_cells: 440 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans |
As can be seen from the table above, 81% of the cells have met the quantization error threshold. Since we have attained the desired compression percentage, we will not subdivide the cells further.
Now let’s try to understand the plotHVT function. Its parameters are explained in detail below:
plotHVT(hvt.results, line.width, color.vec, pch1 = 21, centroid.size = 3, title = NULL, maxDepth = 1)

hvt.results - A list containing the output of the HVT function, which has the details of the tessellations to be plotted.
line.width - A vector indicating the line widths of the tessellation boundaries for each layer.
color.vec - A vector indicating the colors of the tessellation boundaries at each layer.
pch1 - Symbol type for the centroids of the tessellations (parent levels); see ?points (default = 21).
centroid.size - Size of the centroids of the first-level tessellations (default = 3).
title - Set a title for the plot (default = NULL).
Let’s plot the Voronoi tessellation for layer 1 (map A).
muHVT::plotHVT(map_A,
               line.width = c(0.4),
               color.vec = c("#141B41"),
               centroid.size = 0.01,
               maxDepth = 1)

Figure 2: The Voronoi tessellation for layer 1 (map A), shown for the 440 cells in the dataset ’computers’
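The coordinates that plotHVT tessellates come from Sammon’s non-linear mapping of the cell centroids. A minimal sketch of that projection step, using MASS::sammon on toy k-means centers (not map A’s actual codebook):

```r
# Sketch of the projection step: embed cell centroids in 2D with Sammon's
# mapping. The centroids here are toy k-means centers for illustration only.
library(MASS)
set.seed(240)
centers <- stats::kmeans(matrix(rnorm(300), ncol = 3), centers = 8)$centers
coords <- MASS::sammon(dist(centers), k = 2, trace = FALSE)$points
# 'coords' is an 8 x 2 matrix: one 2D point per cell, ready to tessellate
```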
Heat Maps
We will now overlay each feature as a heatmap on the Voronoi tessellation plot for better visualization and identification of patterns, trends, and variations in the data.
Let’s have a look at the function hvtHmap that we will
use to overlay features as heatmap.
hvtHmap(hvt.results, dataset, child.level, hmap.cols, color.vec, line.width, palette.color = 6)

hvt.results - A list of results obtained from the HVT function (map_A).
dataset - A dataframe containing the variables to overlay as a heatmap. The user can pass an external dataset or the dataset that was used to perform hierarchical vector quantization; either way, it must have the same number of points as the dataset used by the HVT function.
child.level - A number indicating the level for which the heat map is to be plotted.
hmap.cols - The column number or column name from the dataset indicating the variable for which the heat map is to be plotted. To plot the quantization error as a heatmap, pass 'quant_error'. Similarly, to plot the number of points in each cell as a heatmap, pass 'no_of_points'.
color.vec - A color vector such
that length(color.vec) = child.level (default = NULL).
line.width - A line width vector
such that length(line.width) = child.level (default = NULL).
palette.color - A number indicating
the heat map color palette. 1 - rainbow, 2 - heat.colors, 3 -
terrain.colors, 4 - topo.colors, 5 - cm.colors, 6 - BlCyGrYlRd
(Blue,Cyan,Green,Yellow,Red) color (default = 6).
show.points - A boolean indicating
whether the centroids should be plotted on the tessellations (default =
FALSE).
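Conceptually, the value a heatmap shows for a cell is the chosen variable aggregated over that cell’s points, mapped through the palette. A base-R sketch of that idea (the aggregation by mean and the five-color ramp are assumptions for illustration, not the package’s exact palette):

```r
# Sketch: aggregate a variable per cell, then bin the aggregates into colors.
set.seed(240)
x <- rnorm(200)                            # the variable to overlay
cell <- sample(1:10, 200, replace = TRUE)  # each point's cell assignment
cell_means <- tapply(x, cell, mean)        # one heatmap value per cell
pal <- colorRampPalette(c("#141B41", "cyan", "green", "yellow", "red"))(10)
cols <- pal[cut(cell_means, breaks = 10, labels = FALSE)]
```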
Now let’s plot the Voronoi tessellation with the heatmap overlaid for all the features in the computers data for better visualization and interpretation of data patterns and distributions.
The heatmaps displayed below provide a visual representation of the spatial characteristics of the computers data, allowing us to observe patterns and trends in the distribution of each of the features (n, price, speed, hd, ram, screen, ads). The green shades highlight regions with higher values in each heatmap, while the indigo shades indicate areas with the lowest values. By analyzing these heatmaps, we can gain insights into the variations and relationships between these features within the computers data.
hvtHmap(
  map_A,
  trainComputers,
  child.level = 1,
  hmap.cols = "n",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 440
)

Figure 3: The Voronoi tessellation with the heat map overlaid for the number of entities in each cell
hvtHmap(
  map_A,
  trainComputers,
  child.level = 1,
  hmap.cols = "price",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 440
)

Figure 4: The Voronoi tessellation with the heat map overlaid for variable ’price’ in the ’computers’ dataset
hvtHmap(
  map_A,
  trainComputers,
  child.level = 1,
  hmap.cols = "speed",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 440
)

Figure 5: The Voronoi tessellation with the heat map overlaid for variable ’speed’ in the ’computers’ dataset
hvtHmap(
  map_A,
  trainComputers,
  child.level = 1,
  hmap.cols = "hd",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 440
)

Figure 6: The Voronoi tessellation with the heat map overlaid for variable ’hd’ in the ’computers’ dataset
hvtHmap(
  map_A,
  trainComputers,
  child.level = 1,
  hmap.cols = "ram",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 440
)

Figure 7: The Voronoi tessellation with the heat map overlaid for variable ’ram’ in the ’computers’ dataset
hvtHmap(
  map_A,
  trainComputers,
  child.level = 1,
  hmap.cols = "screen",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 440
)

Figure 8: The Voronoi tessellation with the heat map overlaid for variable ’screen’ in the ’computers’ dataset
hvtHmap(
  map_A,
  trainComputers,
  child.level = 1,
  hmap.cols = "ads",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 440
)

Figure 9: The Voronoi tessellation with the heat map overlaid for variable ’ads’ in the ’computers’ dataset
Let us try to visualize map B from the flow diagram below.
Figure 10: Flow map with highlighted bounding box in red around map B
In this section, we will manually identify the novelty cells from the plotted map A and store them in the identified_Novelty_cells variable.
Note: To manually select the novelty cells from map A, one can enhance its interactivity by adding plotly elements to the code. This transforms map A into an interactive plot; hovering over a cell’s centroid displays a tag with its segment child information, so users can explore the map and selectively choose the novelty cells they wish to consider.
The removeNovelty function removes the identified novelty cell(s) from the dataset and stores those records separately.
It takes as input the cell numbers (Segment.Child) of the manually identified novelty cell(s) from the above table, together with the compressed HVT map (map A). It returns a list of two items: the records belonging to the novelty cells, and the subset of the dataset without those records.
identified_Novelty_cells <- c(73,321,332,338,435)
output_list <- removeNovelty(identified_Novelty_cells, map_A)

[1] "The following cell(s) have been removed as outliers from the dataset: 73 321 332 338 435"
data_with_novelty <- output_list[[1]]
dataset_without_novelty <- output_list[[2]]

Let’s have a look at the data with novelties. For the sake of brevity, we will only show the first 20 rows.
novelty_data <- data_with_novelty
novelty_data$Row.No <- row.names(novelty_data)
novelty_data <- novelty_data %>% dplyr::select("Row.No","Cell.ID","Cell.Number","price","speed","hd","ram","screen","ads")
colnames(novelty_data) <- c("Row.No","Cell.ID","Segment.Child","price","speed","hd","ram","screen","ads")
novelty_data %>% head(100) %>%
  as.data.frame() %>%
  Table(scroll = T, limit = 20)

| Row.No | Cell.ID | Segment.Child | price | speed | hd | ram | screen | ads |
|---|---|---|---|---|---|---|---|---|
| 1 | 429 | 73 | 3.0762240 | 0.6794579 | 0.0421969 | 4.2031619 | 0.4307274 | 0.7120258 |
| 2 | 438 | 321 | 2.6449847 | 0.6794579 | 6.5710368 | 1.3676416 | 0.4307274 | 0.6851382 |
| 3 | 438 | 321 | 1.6578103 | 2.2969425 | 6.5710368 | 1.3676416 | 0.4307274 | -1.2507722 |
| 4 | 439 | 332 | 0.9130997 | 0.6794579 | 4.6232922 | 2.7854017 | 0.4307274 | -2.4607161 |
| 5 | 439 | 332 | 1.8569770 | 1.1076156 | 4.6232922 | 4.2031619 | 2.6404120 | -2.4607161 |
| 6 | 439 | 332 | 1.5192595 | 2.2969425 | 4.6232922 | 4.2031619 | 0.4307274 | -2.4607161 |
| 7 | 439 | 332 | 1.4482522 | 1.1076156 | 4.6232922 | 4.2031619 | 0.4307274 | -2.4607161 |
| 8 | 439 | 332 | 1.8569770 | 1.1076156 | 4.6232922 | 4.2031619 | 2.6404120 | -2.4607161 |
| 9 | 439 | 332 | 1.1555636 | 2.2969425 | 4.6232922 | 2.7854017 | 0.4307274 | -2.4607161 |
| 10 | 439 | 332 | 1.7773103 | 1.1076156 | 4.6232922 | 4.2031619 | 0.4307274 | -2.4607161 |
| 11 | 439 | 332 | 1.3460710 | 0.6794579 | 4.6232922 | 4.2031619 | 0.4307274 | -2.4607161 |
| 12 | 439 | 332 | 1.4482522 | 1.1076156 | 4.6232922 | 4.2031619 | 0.4307274 | -2.4607161 |
| 13 | 218 | 338 | -1.4959522 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -2.2993903 |
| 14 | 218 | 338 | -1.5115391 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -1.9767386 |
| 15 | 218 | 338 | -1.1668940 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -2.2859465 |
| 16 | 218 | 338 | -1.4959522 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -2.4472723 |
| 17 | 218 | 338 | -1.5981334 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -2.2859465 |
| 18 | 218 | 338 | -1.1668940 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -2.2993903 |
| 19 | 218 | 338 | -1.1668940 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -2.4472723 |
| 20 | 218 | 338 | -1.3227637 | 0.6794579 | 0.4473277 | -0.7589986 | -0.6741149 | -1.9767386 |
The plotCells function is used to plot
the Voronoi tessellation using the compressed HVT map (map A) and
highlights the identified novelty cell(s) in red on the map.
plotCells(identified_Novelty_cells, map_A, line.width = c(0.4), centroid.size = 0.01)

Figure 11: The Voronoi tessellation constructed using the compressed HVT map (map A), with the novelty cell(s) highlighted in red
We pass the dataframe with the novelty records to the HVT function, along with the other model parameters mentioned below, to generate map B (layer 2).
Model Parameters
colnames(data_with_novelty) <- c("Cell.ID","Segment.Child","price","speed","hd","ram","screen","ads")
dataset_with_novelty <- data_with_novelty[,-1:-2]
map_B <- list()
mapA_scale_summary = map_A[[3]]$scale_summary
map_B <- muHVT::HVT(dataset_with_novelty,
                    n_cells = 5,
                    depth = 1,
                    quant.err = 0.2,
                    projection.scale = 10,
                    normalize = F,
                    distance_metric = "L1_Norm",
                    error_metric = "max",
                    quant_method = "kmeans",
                    diagnose = F)

The datatable displayed below is the summary from map B (layer 2).
summaryTable(map_B[[3]]$summary, scroll = T, limit = 500)

| Segment.Level | Segment.Parent | Segment.Child | n | Cell.ID | Quant.Error | price | speed | hd | ram | screen | ads |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 2 | 5 | 0.01 | -1.55 | -0.08 | 0.45 | -0.76 | -0.67 | -1.98 |
| 1 | 1 | 2 | 9 | 3 | 0.49 | 1.48 | 1.28 | 4.62 | 3.89 | 0.92 | -2.46 |
| 1 | 1 | 3 | 2 | 2 | 0.38 | 2.15 | 1.49 | 6.57 | 1.37 | 0.43 | -0.28 |
| 1 | 1 | 4 | 9 | 4 | 0.08 | -1.33 | 0.68 | 0.45 | -0.76 | -0.67 | -2.22 |
| 1 | 1 | 5 | 2 | 1 | 0.66 | 4.29 | 0.68 | 1.55 | 4.20 | 1.54 | 0.60 |
Let us try to visualize the compressed map C from the flow diagram below.
Figure 12: Flow map with highlighted bounding box in red around compressed map C
With the novelties removed, we construct another hierarchical Voronoi tessellation, map C (layer 2), on the dataset without novelties, using the model parameters mentioned below.
Model Parameters
map_C <- list()
mapA_scale_summary = map_A[[3]]$scale_summary
map_C <- muHVT::HVT(dataset_without_novelty,
n_cells = 10,
depth = 2,
quant.err = 0.2,
projection.scale = 10,
normalize = F,
distance_metric = "L1_Norm",
error_metric = "max",
quant_method = "kmeans",
diagnose = F,
                    scale_summary = mapA_scale_summary)

Now let’s check the compression summary for HVT (map C). The table below shows, for each level, the number of cells, the number of cells with quantization error below the threshold, and the percentage of cells with quantization error below the threshold.
mapC_compression_summary <- map_C[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapC_compression_summary)

| segmentLevel | noOfCells | noOfCellsBelowQuantizationError | percentOfCellsBelowQuantizationErrorThreshold | parameters |
|---|---|---|---|---|
| 1 | 10 | 0 | 0 | n_cells: 10 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans |
| 2 | 100 | 7 | 0.07 | n_cells: 10 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans |
As can be seen from the table above, 0% of the cells have met the quantization error threshold in level 1, and 7% of the cells have met it in level 2. Since we are yet to achieve at least 80% compression at depth 2, let’s try to compress again using the set of model parameters mentioned below.
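This retry can be seen as searching for an n_cells value at which the compression objective is met. A base-R sketch of that loop, with k-means and a mean-L1/max error standing in (as assumptions) for the package’s quantization; the data, threshold, and step size are illustrative:

```r
# Sketch: grow the number of cells until at least 80% of them have a max
# quantization error below the threshold, mirroring what the compression
# summary reports. Not the muHVT internals.
set.seed(240)
X <- scale(matrix(rnorm(900), ncol = 3))  # 300 toy points, 3 features
quant_thresh <- 0.2
frac_below <- function(k) {
  km <- stats::kmeans(X, centers = k, nstart = 5)
  err <- sapply(seq_len(k), function(i) {
    pts <- X[km$cluster == i, , drop = FALSE]
    max(rowMeans(abs(sweep(pts, 2, km$centers[i, ]))))
  })
  mean(err <= quant_thresh)               # fraction of cells below threshold
}
k <- 10
while (frac_below(k) < 0.8 && k < 80) k <- k + 10
```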
Model Parameters
map_C <- list()
map_C <- muHVT::HVT(dataset_without_novelty,
                    n_cells = 23,
                    depth = 2,
                    quant.err = 0.2,
                    projection.scale = 10,
                    normalize = F,
                    distance_metric = "L1_Norm",
                    error_metric = "max",
                    quant_method = "kmeans",
                    diagnose = F,
                    scale_summary = mapA_scale_summary)

The datatable displayed below is the summary from map C (layer 2).
summaryTable(map_C[[3]]$summary, scroll = T, limit = 500)

| Segment.Level | Segment.Parent | Segment.Child | n | Cell.ID | Quant.Error | price | speed | hd | ram | screen | ads |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 391 | 512 | 0.56 | -1.27 | -1.04 | -0.91 | -0.83 | -0.59 | 0.86 |
| 1 | 1 | 2 | 105 | 96 | 1.02 | 2.08 | 0.75 | 0.11 | 0.71 | 2.64 | 0.56 |
| 1 | 1 | 3 | 190 | 55 | 1.37 | 1.48 | 0.02 | 2.32 | 2.63 | 0.12 | -0.22 |
| 1 | 1 | 4 | 86 | 15 | 0.94 | 0.87 | 1.27 | 2.61 | 2.57 | 1.33 | -1.76 |
| 1 | 1 | 5 | 257 | 326 | 0.49 | 0.41 | 0.45 | -0.08 | -0.20 | -0.67 | 0.66 |
| 1 | 1 | 6 | 116 | 201 | 0.77 | 1.57 | 0.45 | -0.24 | 0.27 | -0.30 | -1.30 |
| 2 | 20 | 11 | 11 | 42 | 0.07 | 0.15 | 2.30 | 1.70 | 1.37 | 0.43 | -2.39 |
| 2 | 20 | 12 | 1 | 31 | 0 | 0.57 | 2.30 | 3.30 | 1.37 | 0.43 | -1.25 |
| 2 | 20 | 13 | 5 | 63 | 0.08 | -0.29 | 0.68 | 2.29 | 1.37 | 0.43 | -2.35 |
| 2 | 20 | 14 | 6 | 66 | 0.05 | 0.39 | 2.30 | 1.70 | 1.37 | 0.43 | -1.23 |
| 2 | 20 | 15 | 1 | 68 | 0 | 0.31 | 0.68 | 2.29 | 1.37 | 0.43 | -1.98 |
| 2 | 20 | 16 | 3 | 78 | 0.05 | 1.06 | 2.30 | 1.23 | 1.37 | 0.43 | -0.62 |
| 2 | 20 | 17 | 4 | 27 | 0.05 | 0.19 | 2.30 | 3.07 | 1.37 | 0.43 | -2.25 |
| 2 | 20 | 18 | 0 | NA | NA | NA | NA | NA | NA | NA | NA |
| 2 | 20 | 19 | 0 | NA | NA | NA | NA | NA | NA | NA | NA |
| 2 | 20 | 20 | 0 | NA | NA | NA | NA | NA | NA | NA | NA |
| 2 | 20 | 21 | 0 | NA | NA | NA | NA | NA | NA | NA | NA |
| 2 | 20 | 22 | 0 | NA | NA | NA | NA | NA | NA | NA | NA |
| 2 | 20 | 23 | 0 | NA | NA | NA | NA | NA | NA | NA | NA |
| 2 | 21 | 1 | 4 | 504 | 0.17 | 0.72 | -0.89 | -0.66 | -0.76 | 2.64 | 1.36 |
| 2 | 21 | 2 | 5 | 245 | 0.17 | 0.97 | 0.68 | -0.65 | -0.62 | 2.64 | 1.03 |
| 2 | 21 | 3 | 6 | 313 | 0.16 | -0.20 | 0.68 | -0.40 | -0.52 | 2.64 | 0.41 |
| 2 | 21 | 4 | 10 | 338 | 0.18 | 0.27 | -0.89 | -0.34 | -0.05 | 2.64 | 0.32 |
| 2 | 21 | 5 | 4 | 375 | 0.13 | 0.16 | -0.49 | 0.04 | -0.76 | 2.64 | 0.66 |
| 2 | 21 | 6 | 6 | 298 | 0.15 | 1.35 | -0.08 | -0.45 | -0.76 | 2.64 | 0.94 |
| 2 | 21 | 7 | 10 | 526 | 0.21 | -0.31 | -1.08 | -0.88 | -0.76 | 2.64 | 1.28 |
| 2 | 21 | 8 | 11 | 490 | 0.15 | -0.83 | -0.89 | -0.25 | -0.76 | 2.64 | -0.33 |
| 2 | 21 | 9 | 6 | 393 | 0.19 | 1.39 | -0.89 | -0.56 | -0.76 | 2.64 | -0.04 |
| 2 | 21 | 10 | 8 | 359 | 0.11 | 0.49 | -0.89 | -0.39 | -0.05 | 2.64 | 0.95 |
| 2 | 21 | 11 | 6 | 295 | 0.1 | 0.19 | -0.08 | -0.29 | -0.05 | 2.64 | 0.75 |
| 2 | 21 | 12 | 7 | 376 | 0.09 | -0.05 | -0.89 | -0.19 | -0.05 | 2.64 | 0.77 |
| 2 | 21 | 13 | 18 | 205 | 0.2 | 0.58 | 0.68 | -0.13 | -0.13 | 2.64 | 0.65 |
| 2 | 21 | 14 | 6 | 186 | 0.18 | 0.55 | 0.55 | 0.01 | -0.05 | 2.64 | 1.50 |
| 2 | 21 | 15 | 6 | 328 | 0.14 | 0.27 | -0.89 | -0.73 | -0.05 | 2.64 | -0.76 |
| 2 | 21 | 16 | 2 | 215 | 0.1 | 0.75 | -0.08 | -0.46 | -0.05 | 2.64 | -0.35 |
| 2 | 21 | 17 | 5 | 408 | 0.12 | -0.53 | -0.08 | -0.29 | -0.76 | 2.64 | 0.19 |
Now let’s check the compression summary for HVT (map C). The table below shows, for each level, the number of cells, the number of cells whose quantization error is below the threshold, and the percentage of such cells.
mapC_compression_summary <- map_C[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapC_compression_summary)
| segmentLevel | noOfCells | noOfCellsBelowQuantizationError | percentOfCellsBelowQuantizationErrorThreshold | parameters |
|---|---|---|---|---|
| 1 | 23 | 0 | 0 | n_cells: 23 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans |
| 2 | 508 | 434 | 0.85 | n_cells: 23 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans |
As can be seen from the table above, 0% of the cells hit the quantization error threshold at level 1, while 85% of the cells hit it at level 2.
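The percentage column above is simply the share of cells whose quantization error fell below the threshold. As a minimal sketch (the data frame below just re-enters the counts from the table; column names are illustrative, not the exact compression_summary schema):

```r
# Recompute the per-level compression figures from the counts shown above.
summary_df <- data.frame(
  segmentLevel = c(1, 2),
  noOfCells = c(23, 508),
  noOfCellsBelowQuantizationError = c(0, 434)
)
summary_df$percentBelowThreshold <-
  round(summary_df$noOfCellsBelowQuantizationError / summary_df$noOfCells, 2)
summary_df$percentBelowThreshold  # 0.00 for level 1, 0.85 for level 2
```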
Let’s plot the Voronoi tessellation for layer 2 (map C).
muHVT::plotHVT(map_C,
               line.width = c(0.4, 0.2),
               color.vec = c("#141B41", "#0582CA"),
               centroid.size = 0.1,
               maxDepth = 2)
Figure 13: The Voronoi tessellation for layer 2 (map C), shown for the 100 cells of the ’computers’ dataset at level 2
Heat Maps
Now let’s plot all the features for each cell at level two as a heatmap for better visualization.
The heatmaps displayed below provide a visual representation of the spatial characteristics of the computers data, allowing us to observe patterns and trends in the distribution of each of the features (n, price, speed, hd, ram, screen, ads). In each heatmap, the green shades highlight regions with higher values, while the indigo shades indicate areas with the lowest values. By analyzing these heatmaps, we can gain insight into the variations and relationships between these features within the computers data.
hvtHmap(
  map_C,
  trainComputers,
  child.level = 2,
  hmap.cols = "n",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 14: The Voronoi tessellation with the heat map overlaid for the number of entities (n) in each cell
hvtHmap(
  map_C,
  trainComputers,
  child.level = 2,
  hmap.cols = "price",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 15: The Voronoi tessellation with the heat map overlaid for the feature price in the ’computers’ dataset
hvtHmap(
  map_C,
  trainComputers,
  child.level = 2,
  hmap.cols = "speed",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 16: The Voronoi tessellation with the heat map overlaid for the feature speed in the ’computers’ dataset
hvtHmap(
  map_C,
  trainComputers,
  child.level = 2,
  hmap.cols = "hd",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 17: The Voronoi tessellation with the heat map overlaid for the feature hd in the ’computers’ dataset
hvtHmap(
  map_C,
  trainComputers,
  child.level = 2,
  hmap.cols = "ram",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 18: The Voronoi tessellation with the heat map overlaid for the feature ram in the ’computers’ dataset
hvtHmap(
  map_C,
  trainComputers,
  child.level = 2,
  hmap.cols = "screen",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 19: The Voronoi tessellation with the heat map overlaid for the feature screen in the ’computers’ dataset
hvtHmap(
  map_C,
  trainComputers,
  child.level = 2,
  hmap.cols = "ads",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 20: The Voronoi tessellation with the heat map overlaid for the feature ads in the ’computers’ dataset
We now have the set of maps (map A, map B and map C) that will be used to predict which map and cell each test record is assigned to. Before scoring, let’s have a look at the test dataset.
Raw Testing Dataset
The testing dataset includes the following columns:
Let’s have a look at our randomly selected test dataset containing 1253 data points.
Table(head(testComputers_data))
| Row.No | price | speed | hd | ram | screen | ads |
|---|---|---|---|---|---|---|
| 3 | 1595 | 25 | 170 | 4 | 15 | 94 |
| 4 | 1849 | 25 | 170 | 8 | 14 | 94 |
| 7 | 1720 | 25 | 170 | 4 | 14 | 94 |
| 10 | 2575 | 50 | 210 | 4 | 15 | 94 |
| 11 | 2195 | 33 | 170 | 8 | 15 | 94 |
| 14 | 2295 | 25 | 245 | 8 | 14 | 94 |
The predictLayerHVT function is used to score the test data against the predictive set of maps. It takes as input the test data and the set of maps (map A, map B, map C).
Now, let us understand the predictLayerHVT function.
predictLayerHVT(data,
                map_A,
                map_B,
                map_C,
                mad.threshold = 0.2,
                normalize = T,
                distance_metric = "L1_Norm",
                error_metric = "max",
                child.level = 1,
                line.width = c(0.6, 0.4, 0.2),
                color.vec = c("#141B41", "#6369D1", "#D8D2E1"),
                yVar = NULL,
                ...)
Each of the parameters of the predictLayerHVT function is explained below:
data - A dataframe containing the test dataset. Variables from this dataset can also be overlaid as a heatmap.
map A - A list of hvt.results.model obtained from the HVT function when performing hierarchical vector quantization on the training data.
map B - A list of hvt.results.model obtained from the HVT function when performing hierarchical vector quantization on the data with novelty, which was obtained as a result of the removeNovelty function.
map C - A list of hvt.results.model obtained from the HVT function when performing hierarchical vector quantization on the data without novelty, obtained as a result of the removeNovelty function.
child.level - A number indicating the layer for which the heat map is to be plotted (only used if hmap.cols is not NULL). It specifies the layer within the hierarchical structure of the model.
mad.threshold - A numeric value indicating the permissible mean absolute deviation.
normalize - A logical value indicating whether the columns in your dataset should be normalized. The default is TRUE, meaning the columns will be normalized.
distance_metric - The distance metric can be “Euclidean” or “Manhattan”. Euclidean is selected by default.
error_metric - The error metric can be “mean” or “max”. mean is selected by default.
yVar - Name of the dependent variable(s).
... - color.vec and line.width can be passed from here.
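To make the two distance metrics concrete, here is how the Manhattan (L1) and Euclidean (L2) distances between a test record and a cell centroid would be computed; the vectors below are made up for illustration:

```r
# Illustrative scaled test record and cell centroid (invented values).
record   <- c(price = -1.08, speed = -1.27, hd = -0.95)
centroid <- c(price = -1.18, speed = -0.79, hd = -0.70)

manhattan <- sum(abs(record - centroid))       # L1 / Manhattan distance
euclidean <- sqrt(sum((record - centroid)^2))  # L2 / Euclidean distance
```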
The function predicts based on the HVT maps - map A, map B and map C - constructed using the HVT function. For each test record, the function assigns a cell in Layer1 and in Layer2. Layer1 contains the cell IDs from map A, and Layer2 contains cell IDs from map B (the novelty map) and map C (the map without novelty).
Prediction Algorithm
The prediction algorithm recursively calculates the distance between each point in the test dataset and the cell centroids for each level. The following steps explain the prediction method for a single point in the test dataset:
Note: The prediction algorithm will not work if any of the variables used to perform quantization are missing; no features should be removed from the test dataset.
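The core of this step can be sketched as a nearest-centroid search. This is a simplified illustration under the L1 (Manhattan) metric, not the actual predictLayerHVT internals, and the centroid matrix here is invented:

```r
# Toy nearest-centroid assignment: the cell whose centroid is closest
# (by L1 distance) to the point wins.
centroids <- rbind(c1 = c(0, 0), c2 = c(1, 1), c3 = c(-1, 2))

assign_cell <- function(point, centroids) {
  d <- apply(centroids, 1, function(ctr) sum(abs(point - ctr)))
  names(which.min(d))
}

assign_cell(c(0.9, 1.2), centroids)  # "c2"
```

In the hierarchical case this search is simply repeated level by level, restricted each time to the children of the winning cell.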
Let’s see which cell and layer each point belongs to. For the sake of brevity, we will only show the first 10 rows.
validation_data <- testComputers
new_predict <- predictLayerHVT(
  data = validation_data,
  map_A,
  map_B,
  map_C,
  normalize = T
)
prediction_output <- head(new_predict, 7)
row_indices <- c(287, 397, 873)
new_df <- new_predict[row_indices, ]
Predictions <- rbind(prediction_output,new_df)
row.names(Predictions) <- NULL
Predictions %>% head(100) %>%
  as.data.frame() %>%
  Table(scroll = T, limit = 20)
| Row.Number | Layer1.Cell.ID | Layer2.Cell.ID |
|---|---|---|
| 1 | A24 | C478 |
| 2 | A143 | C404 |
| 3 | A29 | C478 |
| 4 | A237 | C404 |
| 5 | A223 | C404 |
| 6 | A162 | C404 |
| 7 | A215 | C404 |
| 287 | A440 | B1 |
| 397 | A440 | B1 |
| 873 | A438 | B1 |
Note: From the table above, we can see that the 287th and 397th observations from the test data, with Layer1 Cell.ID A440, and the 873rd observation, with Cell.ID A438, have been identified as novelties and are mapped to the first novelty cell (B1) in the Layer2.Cell.ID column.
summary_list <- map_A[[3]]
train_colnames <- names(summary_list[["nodes.clust"]][[1]][[1]])
scaled_test_data <- scale(
  testComputers[, train_colnames],
  center = summary_list$scale_summary$mean_data[train_colnames],
  scale = summary_list$scale_summary$std_data[train_colnames])
testComputers <- scaled_test_data
data1 <- data.frame(testComputers)
data1$Row.No <- row.names(testComputers_data)
data1 <- data1 %>% dplyr::select(Row.No,price,speed,hd,ram,screen,ads)
colnames(data1) <- c("Row.No","price_act","speed_act","hd_act","ram_act","screen_act","ads_act")
Layer2.Cell.ID <- new_predict$Layer2.Cell.ID
combined <- cbind(data1,Layer2.Cell.ID)
combined <- combined %>%
mutate(Cell.ID = gsub("[BC]", "", Layer2.Cell.ID))
mapC_summary <- map_C[[3]]$summary
df1 <- combined
df1$Cell.ID <- as.numeric(df1$Cell.ID)
df2 <- mapC_summary
merged_df <- merge(df1, df2, by = "Cell.ID", all.x = TRUE)
sorted_df <- merged_df[order(merged_df$Row.No), ]
sorted_df <- sorted_df %>% select(price_act,speed_act,hd_act,ram_act,screen_act,ads_act,Layer2.Cell.ID,price,speed,hd,ram,screen,ads)
sorted_df$Row.No <- testComputers_data$Row.No
sorted_df <- sorted_df%>% dplyr::select(Row.No,price_act,speed_act,hd_act,ram_act,screen_act,ads_act,Layer2.Cell.ID,price,speed,hd,ram,screen,ads)
colnames(sorted_df) <- c("Row.No","price_act","speed_act","hd_act","ram_act","screen_act","ads_act","Layer2.Cell.ID","price_pred","speed_pred","hd_pred","ram_pred","screen_pred","ads_pred")
sorted_df$diff <- rowMeans(abs(sorted_df[, c("price_act","speed_act","hd_act","ram_act","screen_act","ads_act")] - sorted_df[, c("price_pred","speed_pred","hd_pred","ram_pred","screen_pred","ads_pred")]))
rownames(sorted_df) <- NULL
options(scipen = 999)
sorted_df %>% head(100) %>%
  as.data.frame() %>%
  Table(scroll = T, limit = 10)
| Row.No | price_act | speed_act | hd_act | ram_act | screen_act | ads_act | Layer2.Cell.ID | price_pred | speed_pred | hd_pred | ram_pred | screen_pred | ads_pred | diff |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3 | -1.0785679 | -1.2710382 | -0.9472574 | -0.7589986 | 0.4307274 | -1.7213059 | C478 | -1.1829114 | -0.7922801 | -0.6999883 | -0.7557764 | -0.5234546 | -0.6945646 | 0.4690860 |
| 4 | 1.1382448 | -0.0817113 | -0.7914378 | -0.7589986 | 2.6404120 | -1.7213059 | C360 | 0.2356188 | -0.4603485 | -0.3744382 | -0.4107768 | 2.6404120 | 0.5948051 | 0.7270993 |
| 7 | 1.3374115 | -0.0817113 | 0.1512706 | 1.3676416 | 0.4307274 | -1.1163339 | C201 | 1.5745305 | 0.4514353 | -0.2367672 | 0.2737663 | -0.3026593 | -1.3041998 | 0.5289052 |
| 10 | -1.0629809 | 0.6794579 | -0.7758559 | -0.7589986 | -0.6741149 | -0.5382496 | C478 | -1.1829114 | -0.7922801 | -0.6999883 | -0.7557764 | -0.5234546 | -0.6945646 | 0.3296222 |
| 11 | -0.6455967 | 0.6794579 | 0.4473277 | -0.0501185 | -0.6741149 | -0.5382496 | C306 | -0.7716661 | 0.5467115 | 0.2755179 | -0.2228202 | -0.1702066 | -0.7871317 | 0.2260196 |
| 14 | 0.9286867 | -0.8904536 | 2.2859986 | 2.7854017 | 0.4307274 | -0.5382496 | C55 | 1.4810668 | 0.0181922 | 2.3175726 | 2.6287019 | 0.1225346 | -0.2170130 | 0.3797882 |
| 15 | 1.0845564 | -0.8904536 | 2.2859986 | 2.7854017 | 0.4307274 | -0.5382496 | C55 | 1.4810668 | 0.0181922 | 2.3175726 | 2.6287019 | 0.1225346 | -0.2170130 | 0.3538099 |
| 19 | -0.8984519 | 0.6794579 | 0.0266149 | -0.0501185 | -0.6741149 | -0.5382496 | C306 | -0.7716661 | 0.5467115 | 0.2755179 | -0.2228202 | -0.1702066 | -0.7871317 | 0.2389879 |
| 22 | -1.0716404 | 0.6794579 | 0.4940736 | -0.0501185 | -0.6741149 | -0.5382496 | C306 | -0.7716661 | 0.5467115 | 0.2755179 | -0.2228202 | -0.1702066 | -0.7871317 | 0.2627947 |
| 24 | -1.3660608 | -0.8904536 | 0.0421969 | -0.7589986 | 0.4307274 | -0.5382496 | C478 | -1.1829114 | -0.7922801 | -0.6999883 | -0.7557764 | -0.5234546 | -0.6945646 | 0.3562045 |
hist(sorted_df$diff, breaks = 20, col = "blue", main = "Mean Absolute Difference", xlab = "Difference")
Figure 21: Mean Absolute Difference
In this vignette, we have used the computers dataset to create a predictive set of maps for monitoring entities over time with predictLayerHVT().
Our goal is to achieve a data compression of at least 80%.
We construct a compressed HVT map (map A) using HVT() on the training dataset, setting n_cells to 440 and quant.err to 0.2, and are able to attain a compression of 81%.
Based on the output of the above step, we manually identify the novelty cell(s) from the plotted map A. For this dataset, we identify the 77th, 203rd, 231st, 262nd and 318th cells as novelty cells.
We pass the identified novelty cell(s) as a parameter to removeNovelty() along with HVT map A. The function removes those novelty cell(s) from the dataset and stores them separately. It also returns the dataset without the novelties.
The plotCells() function constructs hierarchical Voronoi tessellations and highlights the identified novelty cell(s) in red.
The dataset with novelty is then passed to HVT() to construct another HVT map (map B), this time setting the parameters n_cells = 3 and depth = 1 when constructing the map.
The dataset without novelties is then passed to HVT() to construct another HVT map (map C), this time setting the parameters n_cells = 100 and depth = 2 when constructing the map.
Finally, the set of maps - map A, map B, and map C are passed to the predictLayerHVT() along with the test dataset to predict which map and what cell each test record is assigned to.
The output of predictLayerHVT is a dataset with two columns Layer1.Cell.ID and Layer2.Cell.ID. Layer1.Cell.ID contains cell ids from map A in the form A1,A2,A3…. and Layer2.Cell.ID contains cell ids from map B as B1,B2… depending on the identified novelties and map C as C1,C2,C3…..
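Given that ID format, novelty records can be pulled out of the prediction output with a simple prefix check on Layer2.Cell.ID. The data frame below is a made-up stand-in for the predictLayerHVT output, re-using values from the prediction table shown earlier:

```r
# Stand-in for the predictLayerHVT output (values taken from the table above).
pred <- data.frame(
  Layer1.Cell.ID = c("A24", "A143", "A440"),
  Layer2.Cell.ID = c("C478", "C404", "B1")
)

# Rows routed to a novelty cell in map B carry a "B" prefix in Layer2.Cell.ID.
novelties <- pred[grepl("^B", pred$Layer2.Cell.ID), ]
novelties
```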
From the output of predictLayerHVT we can see that the 287th and 397th observations from the test data, with Layer1 Cell.ID A440, and the 873rd observation, with Cell.ID A438, have been identified as novelties and are mapped to the first novelty cell (B1) in the Layer2.Cell.ID column.
Topology Preserving Maps : https://users.ics.aalto.fi/jhollmen/dippa/node9.html
Vector Quantization : https://en.wikipedia.org/wiki/Vector_quantization
Sammon’s Projection : http://en.wikipedia.org/wiki/Sammon_mapping
Voronoi Tessellations : http://en.wikipedia.org/wiki/Centroidal_Voronoi_tessellation